
Issue Inserting Timestamp Type Field


I’m trying to insert a timestamp-type field into BigQuery. The table is already created and its schema defined.

I’ve tried simply inserting the bigint/long field into the timestamp field.

I’ve also tried converting from a date string:

    "transforms": "ConvertTimestamp",
    "transforms.ConvertTimestamp.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.ConvertTimestamp.field": "DATESTAMP",
    "transforms.ConvertTimestamp.target.type": "Timestamp",
    "transforms.ConvertTimestamp.format": "yyyy-MM-dd'T'HH:mm:ssXXX"
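For context, my understanding of this SMT (an assumption based on general TimestampConverter behavior, not stated in this thread) is that with target.type set to Timestamp, it parses a string field using the configured format and emits a Connect Timestamp logical value, which is epoch milliseconds. A rough Python sketch of that conversion, using a hypothetical sample value:

```python
from datetime import datetime

# Hypothetical input string in the yyyy-MM-dd'T'HH:mm:ssXXX shape
# the transform config above expects.
raw = "2019-11-26T01:30:00+00:00"

# Python's %z accepts the "+00:00" offset that Java's XXX pattern produces.
dt = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S%z")

# Connect's Timestamp logical type carries epoch milliseconds.
millis = int(dt.timestamp() * 1000)
print(millis)  # 1574731800000
```

Note the result is 13 digits (milliseconds), not the 19-digit value in the error below.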

But I always get errors like the following:

[row index 404]: invalid: Timestamp field value is out of range:1574731800000000000

That value has more digits than the one stored in the Avro row. Am I doing something wrong?

I’m using version 1.3.0.
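One observation worth adding (an editorial note, not from the original post): 1574731800000000000 reads like epoch nanoseconds. Interpreted that way it is a plausible 2019 date, but interpreted as the milliseconds a Connect Timestamp carries, it lands tens of millions of years in the future, which would explain the out-of-range error. A quick check:

```python
from datetime import datetime, timezone

raw = 1574731800000000000  # value from the error message

# As nanoseconds since the epoch, this is a sensible date...
print(datetime.fromtimestamp(raw / 1_000_000_000, tz=timezone.utc))
# 2019-11-26 01:30:00+00:00

# ...but as milliseconds it is roughly 50 million years from now,
# far outside any supported TIMESTAMP range.
years_if_millis = raw / 1000 / 86_400 / 365.25
print(years_if_millis)  # ~4.99e7 years since 1970
```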

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 7

Top GitHub Comments

2 reactions
mortenberg80 commented, Nov 30, 2019

We hit the same problem when we tried to store the timestamp as a long, and we fixed it by changing the type in the Avro schema:

    {
      "name": "timestamp",
      "type": {
        "type": "long",
        "connect.name": "org.apache.kafka.connect.data.Timestamp"
      },
      "doc": "Timestamp in millis since epoch UTC"
    }

We had to add the "connect.name": "org.apache.kafka.connect.data.Timestamp" attribute; we only stumbled upon it in this project’s tests.

Does anyone know if there exists any documentation describing the Avro schema types, and their compatibility with BQ schemas?
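A possible alternative worth noting here (an editorial assumption, not confirmed in this thread): standard Avro logical types express the same intent, and the Confluent Avro converter generally maps timestamp-millis to Connect’s Timestamp type, so a schema like the following may work without the connect.name annotation:

```json
{
  "name": "timestamp",
  "type": {
    "type": "long",
    "logicalType": "timestamp-millis"
  },
  "doc": "Timestamp in millis since epoch UTC"
}
```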

0 reactions
archy-bold commented, Dec 3, 2019

Regarding the "Attempted to reduce batch size below 1" error: the issue was in the timestamp-field partitioning. It was a bug in how the PR inserted rows into the table. Info here: https://github.com/wepay/kafka-connect-bigquery/pull/214#issuecomment-561337346

