Weird xarray error after completing backtest. [BUG]
Unfortunately, another bug has arisen, though I do not know whether it is related. It appears after running the backtest code:
backtest_series = model.historical_forecasts(
    series=series_transformed,
    start=series_transformed.get_timestamp_at_point(0.9995),
    # Split data 60 train / 40 validate-test --- value
    forecast_horizon=7,
    stride=5,
    retrain=False,
    num_samples=1,
    verbose=True,
)
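For context, a stride of 5 means historical_forecasts only produces a forecast every fifth timestamp, so the number of stored forecast points is roughly len(series) / stride rather than len(series). A minimal sketch of that arithmetic (the series length 4856 is taken from the traceback below; this is a hypothetical illustration, not darts internals):

```python
import math

# Hypothetical point count for a strided backtest: one forecast is kept
# every `stride` timestamps, so a 4856-step window with stride=5 yields
# ceil(4856 / 5) = 972 forecast points -- the two lengths that clash in
# the ValueError reported below.
n_timestamps = 4856
stride = 5

n_forecasts = len(range(0, n_timestamps, stride))
assert n_forecasts == math.ceil(n_timestamps / stride)
print(n_forecasts)  # 972
```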
An error occurs after the entire specified length has been predicted:
ValueError Traceback (most recent call last)
/tmp/ipykernel_2006/609873824.py in <module>
7 retrain=False,
8 num_samples=1,
----> 9 verbose=True
10 )
11 print(backtest_series)
/pdrive/dt-env/lib/python3.7/site-packages/darts/utils/utils.py in sanitized_method(self, *args, **kwargs)
170
171 getattr(self, sanity_check_method)(*only_args.values(), **only_kwargs)
--> 172 return method_to_sanitize(self, *only_args.values(), **only_kwargs)
173
174 return sanitized_method
/pdrive/dt-env/lib/python3.7/site-packages/darts/models/forecasting/forecasting_model.py in historical_forecasts(self, series, past_covariates, future_covariates, num_samples, start, forecast_horizon, stride, retrain, overlap_end, last_points_only, verbose)
443 step=1,
444 ),
--> 445 np.array(last_points_values),
446 )
447
/pdrive/dt-env/lib/python3.7/site-packages/darts/timeseries.py in from_times_and_values(cls, times, values, fill_missing_dates, freq, columns, fillna_value)
604 coords[DIMS[1]] = columns
605
--> 606 xa = xr.DataArray(values, dims=(times_name,) + DIMS[-2:], coords=coords)
607
608 return cls.from_xarray(
/pdrive/dt-env/lib/python3.7/site-packages/xarray/core/dataarray.py in __init__(self, data, coords, dims, name, attrs, indexes, fastpath)
404 data = _check_data_shape(data, coords, dims)
405 data = as_compatible_data(data)
--> 406 coords, dims = _infer_coords_and_dims(data.shape, coords, dims)
407 variable = Variable(dims, data, attrs, fastpath=True)
408 indexes = dict(
/pdrive/dt-env/lib/python3.7/site-packages/xarray/core/dataarray.py in _infer_coords_and_dims(shape, coords, dims)
153 if s != sizes[d]:
154 raise ValueError(
--> 155 f"conflicting sizes for dimension {d!r}: "
156 f"length {sizes[d]} on the data but length {s} on "
157 f"coordinate {k!r}"
ValueError: conflicting sizes for dimension 'time': length 972 on the data but length 4856 on coordinate 'time'
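The mismatch can be reproduced outside darts with a bare xarray call. The shapes below are the ones from the traceback (972 strided forecast values paired with a full-resolution 4856-point time index built with step=1); the construction itself is a hypothetical sketch of what from_times_and_values does, not the actual darts code:

```python
import numpy as np
import pandas as pd
import xarray as xr

# 972 forecast values (one per stride step) ...
values = np.zeros((972, 1, 1))
# ... paired with a time coordinate covering the full 4856-step range.
times = pd.date_range("2020-01-01", periods=4856, freq="D")

try:
    xr.DataArray(values, dims=("time", "component", "sample"),
                 coords={"time": times})
    error_message = None
except ValueError as exc:
    error_message = str(exc)

print(error_message)
# conflicting sizes for dimension 'time': length 972 on the data
# but length 4856 on coordinate 'time'
```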
_Originally posted by @Joe-TheBro in https://github.com/unit8co/darts/issues/814#issuecomment-1049543735_
Issue Analytics
- State: Closed
- Created 2 years ago
- Comments: 7 (2 by maintainers)
I can confirm this works on my Linux setup with no bugs. It must be a Colab issue.
Closing for now. Don’t hesitate to re-open if you meet this issue again.