Aries export bug

Looks like there is a bug in the aries export endpoint.
When using the 'forecastStartToLatestProd' flag, some wells in the page cause the job to fail and return a 500 status code.

I imagine there is an issue with the last prod date that is causing it to fail?

con_session = cc_session.makeSession()  # authenticated requests session

project_id = '62829fdf2768d600136fcc42'
forecast_id = '6283f34e1a8c340013164fdd'

total = 1700
skip_val = 200  # page size; also tried 100 and 50
skips = [x for x in range(total) if x % skip_val == 0]
print('skip intervals:', skips)
for skip in skips:
    url = ('https://api.combocurve.com/v1/'
           f'projects/{project_id}/'
           f'forecasts/{forecast_id}/'
           f'aries?skip={skip}&take={skip_val}&'
           'pSeries=best&'
           'selectedIdKey=inptID&'
           'endingCondition=ending_rate&'
           'forecastUnit=per_month&'
           'toLife=yes&'
           'dataResolution=monthly&'
           'includeZeroForecast=True&'
           'forecastStartToLatestProd=True')
    response = con_session.get(url, verify=False)
    print(response.status_code, end=',')
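For what it's worth, the query string above can also be built with the standard library's urlencode, which keeps the parameters in one place and handles escaping. A sketch using the same IDs and parameter names as above:

```python
from urllib.parse import urlencode

project_id = '62829fdf2768d600136fcc42'
forecast_id = '6283f34e1a8c340013164fdd'

# All query parameters in one dict; urlencode handles the '&' joining.
params = {
    'skip': 0,
    'take': 200,
    'pSeries': 'best',
    'selectedIdKey': 'inptID',
    'endingCondition': 'ending_rate',
    'forecastUnit': 'per_month',
    'toLife': 'yes',
    'dataResolution': 'monthly',
    'includeZeroForecast': 'True',
    'forecastStartToLatestProd': 'True',
}
url = ('https://api.combocurve.com/v1/'
       f'projects/{project_id}/forecasts/{forecast_id}/aries?'
       + urlencode(params))
```

The loop then only has to update `params['skip']` and re-encode, instead of rebuilding the whole string.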

Here are some sample runs:

skip intervals: [0, 200, 400, 600, 800, 1000, 1200, 1400, 1600]
200,200,200,500,500,500,500,500,200,

skip intervals: [0, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1500, 1600]
200,200,200,200,200,200,200,500,500,500,500,200,200,500,200,500,200,

skip intervals: [0, 50, 100, 150, 200, 250, 300, 350, 400, 450, 500, 550, 600, 650, 700, 750, 800, 850, 900, 950, 1000, 1050, 1100, 1150, 1200, 1250, 1300, 1350, 1400, 1450, 1500, 1550, 1600, 1650]
200,200,200,200,200,200,200,200,200,200,200,200,200,200,200,500,200,500,500,200,500,200,200,200,200,200,200,500,200,200,500,200,200,200,
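Since the failing pages shift around as the page size shrinks, one way to pin down the offending wells would be to bisect a failing window down to take=1. This is only a sketch: `is_failing(skip, take)` is a hypothetical callback that would issue the aries request for that window and report whether it returns a 500.

```python
def find_failing_wells(skip, take, is_failing):
    """Return the take=1 skip offsets inside [skip, skip + take) that fail.

    is_failing(skip, take) -> bool is a hypothetical callback that requests
    that window from the endpoint and returns True on a 500 response.
    """
    if not is_failing(skip, take):
        return []          # whole window is clean, nothing to report
    if take == 1:
        return [skip]      # narrowed down to a single well
    half = take // 2
    # Recurse into both halves; failures can occur in either or both.
    return (find_failing_wells(skip, half, is_failing)
            + find_failing_wells(skip + half, take - half, is_failing))
```

For example, `find_failing_wells(600, 200, is_failing)` would narrow the failing 600-800 window above to the individual well offsets, at a cost of O(failures x log(take)) requests instead of 200.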

Hey Lucas,

Some people on the team are going to look into this.

Not that it's necessarily expected to solve the issue, but in the meantime I recommend trying our get_next_page_url pagination helper.

Regards

Yeah, that doesn't help. It returns None at the first failure.

url = f'{root_url}&take=200{base_params}'
print('record_count:', int(getCount(con_session, root_url)))
has_more = True
while has_more:
    response = con_session.get(url, verify=False)
    # Follow the Link header to the next page; None when there is no next page
    url = get_next_page_url(response.headers)
    print(response.status_code, end=',')
    has_more = url is not None

[Out]
record_count: 1613
200,200,200,500,
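A possible workaround until a fix lands: compute the skip offsets up front from the record count instead of relying on the Link header, so the loop keeps going past pages that 500. A sketch (`page_offsets` is a hypothetical helper; the session and URL names match the snippet above):

```python
import math

def page_offsets(total, take):
    """All skip offsets needed to cover `total` records at `take` per page."""
    return [i * take for i in range(math.ceil(total / take))]

# Hedged sketch of the loop, assuming con_session / root_url / base_params
# are set up as in the snippet above:
#
# for skip in page_offsets(1613, 200):
#     response = con_session.get(
#         f'{root_url}&skip={skip}&take=200{base_params}', verify=False)
#     print(skip, response.status_code)
```

Unlike the Link-header walk, this visits every page and records which ones fail, rather than stopping at the first 500.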

@lucas.fontenelle, you found a legit bug here. We've been working on a fix, which is finally being deployed. It should be live in about 20 minutes.

Thanks for finding and reporting it.