Parsing the following Spark SQL fails immediately with an error:

```sql
INSERT OVERWRITE TABLE vup.ads_vendor_summary PARTITION (dt = '${dt}')
-- lightly aggregated data for orders placed on the data date
SELECT
  store_id,
  COALESCE(department_id, 0) AS department_id,
  SUM(order_num) AS order_num
FROM
  vup.ads_vendor_detail
WHERE
  dt = '${dt}'
GROUP BY
  store_id,
  COALESCE(department_id, 0)
GROUPING SETS (
  (store_id, COALESCE(department_id, 0)),
  (COALESCE(department_id, 0))
);
```
Does Spark itself support this syntax?
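For reference, Spark's SQL dialect does document `GROUP BY ... GROUPING SETS` where the grouping sets may contain expressions such as `COALESCE(...)` that also appear in the `GROUP BY` list, so the reported statement is expected to parse in Spark itself. A minimal, self-contained repro sketch follows; the table `t` is a hypothetical stand-in for `vup.ads_vendor_detail`, and Spark 3.x syntax is assumed:

```sql
-- Minimal repro sketch, assuming Spark 3.x SQL syntax.
-- Table `t` is a hypothetical stand-in for vup.ads_vendor_detail.
CREATE TABLE t (store_id INT, department_id INT, order_num INT) USING parquet;

-- Same GROUP BY ... GROUPING SETS shape as the reported statement:
-- an expression (COALESCE) appears both in the GROUP BY list and in the sets.
-- For the set that omits store_id, Spark emits NULL for that column.
SELECT
  store_id,
  COALESCE(department_id, 0) AS department_id,
  SUM(order_num) AS order_num
FROM t
GROUP BY
  store_id,
  COALESCE(department_id, 0)
GROUPING SETS (
  (store_id, COALESCE(department_id, 0)),
  (COALESCE(department_id, 0))
);
```

If this runs in `spark-sql` but the parser here rejects it, the failure would point to a gap in the parser's `GROUPING SETS` grammar rather than to invalid input.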