Empty URLs When making Async Request
When running through an async request with the following script:
import time

# `PromotedTweet` comes from the twitter-ads SDK; `account`, `ids`, `split_list`,
# `metric_groups`, `placement`, `granularity`, `start_time`, and `end_time` are
# defined earlier in the script.

queued_job_ids = []
try:
    # Queue one async stats job per chunk of up to 20 entity IDs.
    for chunk_ids in split_list(ids, 20):
        queued_job_ids.append(
            PromotedTweet.queue_async_stats_job(
                account, chunk_ids, metric_groups,
                placement=placement, granularity=granularity,
                start_time=start_time, end_time=end_time
            ).id
        )
        print(chunk_ids)
except:
    pass

print(queued_job_ids)
time.sleep(30)

try:
    async_stats_job_results = PromotedTweet.async_stats_job_result(account, job_ids=queued_job_ids)
except:
    pass

async_data = []
count = 0
for result in async_stats_job_results:
    # time.sleep(15)
    count += 1
    print(count)
    url = result.url
    print(async_stats_job_results)
    print(url)
    print(result)
    try:
        async_data.append(PromotedTweet.async_stats_job_data(account, url=url))
    except:
        pass
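The split_list helper isn't shown in the issue; presumably it just chunks the ID list so that each async job covers at most 20 entities. A minimal sketch of what it might look like (hypothetical, not part of the original script):

def split_list(items, size):
    # Chunk a list into sublists of at most `size` items (20 above), since each
    # async stats job can only cover a limited number of entity IDs.
    return [items[i:i + size] for i in range(0, len(items), size)]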
I’m getting the following response, with the URL coming back as None:
<twitter_ads.cursor.Cursor object at xxxxxx>
None
<Analytics resource at xxxxxx id=xxxxx>
I’ve seen this happen before, and I assume it’s latency on Twitter’s side delaying the report beyond the time I’m waiting for it to be returned.
Has anyone else seen Twitter return empty URLs when making these requests, and if so, what have you done to ensure that each request comes back with a JSON file?
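One likely cause is that url stays None until the report has actually finished generating, so grabbing it after a fixed 30-second sleep is a race. A minimal sketch of polling the job status before downloading, assuming the result objects returned by async_stats_job_result expose a status attribute that becomes "SUCCESS" once the report is ready (the wait_for_jobs helper, poll interval, and timeout are illustrative, not part of the SDK):

import time

def wait_for_jobs(account, job_ids, poll_interval=15, timeout=600):
    # Re-query the async stats jobs until every one reports SUCCESS, or give up.
    deadline = time.time() + timeout
    while time.time() < deadline:
        results = list(PromotedTweet.async_stats_job_result(account, job_ids=job_ids))
        if all(getattr(r, 'status', None) == 'SUCCESS' for r in results):
            return results
        time.sleep(poll_interval)
    raise TimeoutError('async stats jobs did not finish within {}s'.format(timeout))

# Only fetch the data once every job has a URL to download.
results = wait_for_jobs(account, queued_job_ids)
async_data = [PromotedTweet.async_stats_job_data(account, url=r.url) for r in results]

With something like this, async_stats_job_data is only ever called with a non-empty URL, at the cost of waiting for the slowest job in the batch.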
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 1
- Comments: 6
Top GitHub Comments
I’m also getting an error since December 12th in PromotedTweet.async_stats_job_data, as follows: Seems like the URL is not being composed correctly anymore? Specifically, my script hits the error when I set this variable:
@seanpdwyer7 It varies pretty wildly. I think we have a hard cap at 10 minutes and have seen it time out (more so, more recently) a number of times. But sometimes it’s much, much shorter.
@tushdante I’m not certain that’s relevant here (though feel free to correct me if it is). The two API methods available and called above don’t poll or block on the completion of the report. The aforementioned logic would just prevent one from going over your API call limit? Unless maybe the _result method will 4xx and the retry_delay and retry_status fields are relevant?
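For reference, if the retry fields mentioned above are the client-level options, they are set when constructing the Client. A minimal sketch, assuming the SDK’s options argument accepts retry_max, retry_delay, and retry_on_status (the credential and account constants are placeholders):

from twitter_ads.client import Client

client = Client(
    CONSUMER_KEY, CONSUMER_SECRET, ACCESS_TOKEN, ACCESS_TOKEN_SECRET,
    options={
        'handle_rate_limit': True,           # back off automatically when rate limited
        'retry_max': 3,                      # retry a failed request up to 3 times
        'retry_delay': 5000,                 # wait 5000 ms between retries
        'retry_on_status': [404, 500, 503],  # statuses considered worth retrying
    },
)
account = client.accounts(ACCOUNT_ID)

That would only help if async_stats_job_result actually returns a retryable status while the report is still processing; it doesn’t change the fact that neither method blocks until the job completes.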