Aggregate datashape
This is a…
- [ ] Feature request
- [ ] Regression (a behavior that used to work and stopped working in a new release)
- [x] Bug report
- [ ] Documentation issue or request
Description
I want to create this integration:
- sql select
- split
- filter (I want only some elements from the collection)
- aggregate (I want to continue with the collection)
- datamapper (I want to concatenate all last names from the collection into one message)
- activemq queue
When I use the following flow:
1. start SQL
2. end AMQ
3. split
4. aggregate
The split has the correct datashape (none -> sql result), but the aggregate has amq-in (a json instance in my case) -> sql result.
In this case, the aggregate should have the same datashapes as the sql select, as I only filtered out some elements.
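The intent can be sketched outside Syndesis as plain collection processing (a minimal sketch, not Syndesis code; the `first_name`/`last_name` record fields are hypothetical): filtering a collection never changes its element shape, so the aggregate's datashape should still match the sql select's result.

```python
# Illustrative sketch only: filtering preserves the element shape, so the
# aggregate's input shape should equal the SQL select's output shape.
from typing import List, TypedDict

class SqlRow(TypedDict):  # hypothetical shape of the "sql select" result
    first_name: str
    last_name: str

def split(rows: List[SqlRow]) -> List[SqlRow]:
    return list(rows)  # per-element processing; element shape unchanged

def keep(row: SqlRow) -> bool:
    return row["last_name"] != ""  # "I want only some elements"

def aggregate(rows: List[SqlRow]) -> List[SqlRow]:
    return rows  # a collection of the SAME element shape as the SQL result

rows: List[SqlRow] = [
    {"first_name": "Ada", "last_name": "Lovelace"},
    {"first_name": "Alan", "last_name": ""},
]
result = aggregate([r for r in split(rows) if keep(r)])
# The datamapper step then concatenates the last names into one message:
message = ",".join(r["last_name"] for r in result)
print(message)  # -> Lovelace
```

Only one mapping (after the aggregate) is needed under this view, which is why requiring a second datamapper between split and aggregate looks wrong.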
This results in two datamappers being required by the UI:
Issue Analytics
- State:
- Created 5 years ago
- Comments: 25 (24 by maintainers)
Top GitHub Comments
The latest issue with mixed-up data shapes for aggregate (also reported in #4905) is fixed now but there are still some errors showing up in the Syndesis server log.
This is because the user specifies the data shape for the AMQ connection with a json-instance, and in that case the metadata variant information (collection/single element) is not given. This is why the server meta lookup hits the error and the UI is provided with uninspected data shapes, so the UI has no chance to display proper collection or single element hints.
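The missing variant can in principle be derived from the instance itself (a minimal sketch under that assumption, not the actual Syndesis server code): a json-instance whose top level is an array describes a collection, otherwise a single element.

```python
# Illustrative sketch: infer the collection/single-element variant
# from a user-supplied json-instance specification.
import json

def infer_variant(json_instance: str) -> str:
    """Return 'collection' if the instance's top level is an array,
    'element' otherwise."""
    parsed = json.loads(json_instance)
    return "collection" if isinstance(parsed, list) else "element"

print(infer_variant('[{"last_name": "Lovelace"}]'))  # -> collection
print(infer_variant('{"last_name": "Lovelace"}'))    # -> element
```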
Let me fix this on the server meta lookup.
@heiko-braun In my opinion this is a blocker as all integrations where the user defines the data shapes on the connection step (AMQ, Template, Webhook etc.) might run into the problem that data shapes for split/aggregate are not inspected in a consistent way. I think this fix should be backported to 1.6.x.
@avano I think with this particular issue being fixed we have sorted out the missing collection/single element hints on split/aggregate and we are now on the same page regarding data shape handling for split/aggregate. Do you agree?
@christophd
Anyway, where should the datamapper be placed in this case? I would expect to place it between the aggregate and the amq step, because that is what I really want to map.
According to your comment the datashapes should look like this:
Then the UI would require 2 datamappers, as it does now. Why should I add a datamapper between Split and Aggregate (the output and input datashapes do not match)?
Shouldn’t it be this way?
If I want to use a basic filter between split and aggregate, it should be the same datashape as in the initial SQL step. If I used some connector in between, then the datashapes should look like this, according to my common sense 😃
This is how I imagine it at first glance.