Detect JSON columns
We should be able to automatically detect whether a column contains JSON data and avoid escaping it in the output. Right now, if I have the following table:
```sql
create table s003b.todos
(
    id int not null constraint pk__s003b_todos primary key default (next value for s003b.globalId),
    category_id varchar(3) collate Latin1_General_BIN2 not null constraint fk__s003b_todos__s003b_categories references s003b.categories(id),
    title nvarchar(100) not null,
    [description] nvarchar(1000) null,
    tags nvarchar(max) null check ( isjson(tags) = 1 ),
    completed bit not null default (0)
)
```
and I store JSON data in the “tags” column, I get the following output:
```json
{
  "data": {
    "todos": {
      "items": [
        {
          "id": 10000,
          "title": "item-001",
          "completed": false,
          "tags": "[{\"tag\":\"red\"}]",
          "category": {
            "id": "f",
            "category": "Family"
          }
        }
      ]
    }
  }
}
```
where `tags` contains escaped JSON, even though its content is valid JSON itself. Azure SQL DB doesn't currently have a native JSON data type, but we can check whether a column contains JSON data by looking at the check constraint that should have been created to allow only JSON data to be inserted (see the table definition above). This query returns which columns should be treated as JSON:
```sql
select
    s.[name] as [schema_name],
    t.[name] as [table_name],
    c.[name] as [column_name],
    ck.[definition],
    case when (ck.[definition] like '%isjson(/[' + trim(c.[name]) + '/])=(1)%' escape '/') then 1 else 0 end as [isjson]
from
    sys.check_constraints ck
inner join
    sys.tables t on ck.[parent_object_id] = t.[object_id]
inner join
    sys.columns c on t.[object_id] = c.[object_id] and ck.parent_column_id = c.[column_id]
inner join
    sys.schemas s on t.[schema_id] = s.[schema_id]
```
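The same LIKE pattern can also be applied client-side once the constraint definitions have been fetched. A minimal Python sketch (the helper name is an assumption; the normalized definition string `(isjson([tags])=(1))` reflects how SQL Server stores check-constraint definitions):

```python
import re

def is_json_constraint(definition: str, column_name: str) -> bool:
    """Return True if a check-constraint definition enforces ISJSON on the column.

    Mirrors the LIKE pattern in the T-SQL query above; SQL Server normalizes
    the definition to a form such as "(isjson([tags])=(1))".
    """
    pattern = r"isjson\(\[" + re.escape(column_name.strip()) + r"\]\)=\(1\)"
    return re.search(pattern, definition, re.IGNORECASE) is not None

# Definitions as SQL Server would store them for the sample table:
print(is_json_constraint("(isjson([tags])=(1))", "tags"))      # True
print(is_json_constraint("([completed]>=(0))", "completed"))   # False
```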
Issue Analytics
- State:
- Created a year ago
- Comments: 12 (12 by maintainers)
Top GitHub Comments
Because curiosity meant that I needed to dig into this more, I decided to look at what we need to do to identify JSON columns.

PostgreSQL (and I expect MySQL, but I didn't test) should be reasonably easy to detect, as they have a column type of `json` that is returned when using the `conn.GetSchema("Columns")` query. Azure SQL is a little harder as it isn't a data type; it's a constraint on the table that we can query for, and then find the column the constraint applies to. Here's a SQL query that we can execute:
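(The SQL query this comment promises did not survive extraction; it is presumably a variant of the constraint query shown earlier.) For the PostgreSQL/MySQL path described just above, type-based detection can be sketched in Python — the `json_columns_from_schema` helper and the row shape are assumptions modeled on `information_schema.columns`, not any engine's actual API:

```python
def json_columns_from_schema(columns: list[dict]) -> set[str]:
    """Pick out columns whose declared type is json/jsonb.

    Assumes each schema row carries 'column_name' and 'data_type' keys,
    as PostgreSQL reports via information_schema.columns.
    """
    return {c["column_name"] for c in columns if c["data_type"] in ("json", "jsonb")}

rows = [
    {"column_name": "id", "data_type": "integer"},
    {"column_name": "tags", "data_type": "jsonb"},
]
print(json_columns_from_schema(rows))  # {'tags'}
```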
With this we could add another property to `ColumnDefinition` of `IsJson` (or similar) and use that as part of the codegen for the GraphQL engine. Not enough time to do this with competing deadlines in Jan 2023.

I agree, let's do 1, since we need to support Synapse anyway, and it's unfortunate that ADO.NET doesn't have that support yet.
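To close the loop on the `IsJson` idea: once a column is flagged, the serializer should parse the stored string before building the response, so the value is embedded as nested JSON rather than escaped. A minimal Python sketch (the `embed_json_columns` helper is hypothetical, not part of any engine):

```python
import json

def embed_json_columns(row: dict, json_columns: set[str]) -> dict:
    """Parse string values of known JSON columns so they serialize as
    nested JSON instead of escaped strings (hypothetical helper)."""
    out = dict(row)
    for col in json_columns:
        value = out.get(col)
        if isinstance(value, str):
            out[col] = json.loads(value)
    return out

row = {"id": 10000, "title": "item-001", "tags": "[{\"tag\":\"red\"}]"}
fixed = embed_json_columns(row, {"tags"})
print(json.dumps(fixed))
# {"id": 10000, "title": "item-001", "tags": [{"tag": "red"}]}
```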