When the current UTC time value is inserted through Spark SQLContext, it is saved as the current local time, not UTC

Asked 1 year ago, Updated 1 year ago, 80 views

After cleaning up the data frame through the SQL context in Spark, I run the code below and write the result out over JDBC.

from pyspark.sql.functions import expr  # needed for expr("uuid()") below

df_2.createOrReplaceTempView("vms_status")
q = """SELECT
    srv_name,
    srv_serial,
    groups_id,
    item_id,
    if(item_param is null, '', item_param) as item_param,
    count(srv_serial) as recv_count,
    min(item_time) as item_min_time,
    max(item_time) as item_max_time,
    int(round(avg(item_value),2)) as item_value,
    NOW() as sdc_ins_time,
    if (avg(item_value) is null, last(item_value), null) as item_value_str
FROM vms_status
GROUP BY srv_name, srv_serial, groups_id, item_id, item_param
"""
df_3 = sql_context.sql(q).withColumn("history_serial", expr("uuid()").cast("String"))
df_3.show(truncate=False)

jdbc_write(df_3, "append", table_name)

When I show() df_3, the result of the code above, the current time is rendered in UTC as expected (the Spark session timezone has been set to UTC). But the value inserted into MSSQL was saved as the current local time, which is UTC+9. Where should I synchronize the timezone? I need the NOW() value in UTC.
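For reference, a minimal sketch of the kind of timezone setup involved (assumed; the actual session builder is not shown in this post). spark.sql.session.timeZone only controls how Spark renders and parses timestamps; the JDBC writer hands the driver java.sql.Timestamp values, which the driver converts using the JVM default timezone, so the driver and executor JVMs usually need to run in UTC as well.

from pyspark.sql import SparkSession

# Assumed setup, not from the original job: force UTC both in Spark SQL
# and in the JVMs themselves. For a driver JVM that is already running,
# the -Duser.timezone option must instead be passed at launch, e.g. via
# spark-submit.
spark = (
    SparkSession.builder
    .config("spark.sql.session.timeZone", "UTC")
    .config("spark.driver.extraJavaOptions", "-Duser.timezone=UTC")
    .config("spark.executor.extraJavaOptions", "-Duser.timezone=UTC")
    .getOrCreate()
)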

python mysql mssql

2022-09-20 19:40

1 Answer

I'm not sure if this is exactly your case, but most of these problems happen because the DB server time is not set properly.

Check which machine's clock the value is written from, and double-check the query you wrote. If it still looks odd, why not look at the table definition itself?

I suspect the column's default value is something like CURRENT_TIME, so you should either fix the DB server time or modify the table definition.
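For example, if the column turns out to have a local-time default such as GETDATE(), on SQL Server it could be rebound to GETUTCDATE(). A hypothetical sketch; the table name, constraint name, and connection details below are placeholders, not from the post.

import pyodbc

# Hypothetical sketch: swap an MSSQL column default from local time
# (GETDATE()) to UTC (GETUTCDATE()). All names below are placeholders.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword"
)
cur = conn.cursor()
cur.execute("ALTER TABLE my_table DROP CONSTRAINT df_sdc_ins_time")
cur.execute(
    "ALTER TABLE my_table "
    "ADD CONSTRAINT df_sdc_ins_time DEFAULT GETUTCDATE() FOR sdc_ins_time"
)
conn.commit()
conn.close()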

P.S.

I've never used Spark before, so I'm not sure about the syntax.

There is a SELECT keyword in the query; is it right to write it like that? I'm not sure, but inserting into a DB usually uses an INSERT statement, so it feels strange to me, which is why I'm asking.
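For what it's worth, the SELECT is normal here: Spark materializes the query into a DataFrame, and its JDBC writer generates the INSERT statements itself. A sketch of what a jdbc_write helper typically looks like (the URL and credentials are placeholders):

def jdbc_write(df, mode, table_name):
    # Spark's JDBC writer issues the INSERTs on its own, so the upstream
    # query only ever needs to SELECT. URL and credentials are placeholders.
    (df.write
        .format("jdbc")
        .option("url", "jdbc:sqlserver://myhost:1433;databaseName=mydb")
        .option("dbtable", table_name)
        .option("user", "myuser")
        .option("password", "mypassword")
        .mode(mode)
        .save())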


2022-09-20 19:40
