flink 1.8 sql rowtime window question


1142632215
1. I need to count, per day, the number of orders in a certain status range (the data source is a MySQL binlog), say status values in 200-300 (a rough sketch of this naive count follows after this list).
2. If an order's status changes from 250 to 400, the count must be decremented by 1, because the order was previously in the 200-300 range and has already been counted.
3. Every record arriving from the source should trigger the window computation and update the per-day order count.
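
For context, the naive per-day count would look roughly like the sketch below (written against the table and field names used later in this mail, with `order` and `timestamp` escaped because they are reserved words). It satisfies neither point 2 (it never decrements when an order leaves the range) nor point 3 (a tumbling window only emits once the watermark passes the end of the day):

-- Naive sketch: orders per day whose current status is in 200-300.
-- Does not retract orders that later leave the range, and only fires
-- when the watermark passes the end of each daily window.
SELECT
  TUMBLE_START(`timestamp`, INTERVAL '1' DAY) AS day_start,
  COUNT(*) AS order_cnt
FROM `order`
WHERE order_status BETWEEN 200 AND 300
GROUP BY TUMBLE(`timestamp`, INTERVAL '1' DAY)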




Because of point 2, I need to aggregate by order id so that the result becomes a RetractStream; the timestamp field is declared as the rowtime attribute in the KafkaTableSource.
Something like: SELECT order_id, last_value(timestamp) AS timestamp, last_value(order_status) AS order_status FROM order GROUP BY order_id





Then I run an OVER window aggregation over this result:
OVER (PARTITION BY df(timestamp, 'yyyy-MM-dd 00:00:00') ORDER BY update_time RANGE BETWEEN INTERVAL '24' HOUR PRECEDING AND CURRENT ROW)
which fails with:
Exception in thread "main" org.apache.flink.table.api.TableException: Retraction on Over window aggregation is not supported yet. Note: Over window aggregation should not follow a non-windowed GroupBy aggregation.
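
For reference, the overall query has roughly the following shape (a sketch; df is my date-truncating UDF, field names are approximate, the ORDER BY is written on the rowtime field here although my actual query uses update_time, and the status filter from point 1 is included for completeness):

-- Sketch of the failing query: an OVER aggregation on top of the
-- non-windowed GROUP BY that produces the retract stream.
SELECT
  order_id,
  COUNT(*) OVER (
    PARTITION BY df(`timestamp`, 'yyyy-MM-dd 00:00:00')
    ORDER BY `timestamp`  -- the rowtime attribute from the source
    RANGE BETWEEN INTERVAL '24' HOUR PRECEDING AND CURRENT ROW
  ) AS daily_cnt
FROM (
  SELECT
    order_id,
    last_value(`timestamp`) AS `timestamp`,
    last_value(order_status) AS order_status
  FROM `order`
  GROUP BY order_id
) t
WHERE order_status BETWEEN 200 AND 300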


Questions:
1. Is there a SQL solution for this kind of scenario?
2. With last_value(rowtime) AS rowtime GROUP BY order_id, aggregating by time again on top of that query fails ("rowtime is not a time attribute"). But as I understand it, the purpose of rowtime is to generate watermarks; even though the rowtime value changes, the watermark should stay the same, and in the DataStream API an operator that receives a watermark simply forwards it downstream without processing it.
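
Concretely, the shape I would like to write for question 2 is roughly the sketch below; it is rejected because the rowtime coming out of the inner GROUP BY is no longer a time attribute:

-- Sketch of the pattern from question 2: a group window on top of a
-- non-windowed GROUP BY. Flink rejects it because `rowtime` here is a
-- plain TIMESTAMP column, not a time attribute anymore.
SELECT
  TUMBLE_START(rowtime, INTERVAL '1' DAY) AS day_start,
  COUNT(*) AS order_cnt
FROM (
  SELECT
    order_id,
    last_value(rowtime) AS rowtime
  FROM `order`
  GROUP BY order_id
) t
GROUP BY TUMBLE(rowtime, INTERVAL '1' DAY)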