Possibility to specify custom schema examples.
See original GitHub issue.

Here's my situation:
- We are building a REST API and we want to document it with SpringDoc/Swagger.
- Some of the APIs return ProductDto, which is defined in a jar file, and we have no access to the codebase of ProductDto.
- The ProductDto has a field of type MonetaryAmount, and we have registered a Jackson module to deal with serialization/de-serialization of MonetaryAmount (a sketch of such a module follows the response snippet below). So when the API is invoked we see this in the response:
"price": {
"amount": "5.00",
"currency": "USD"
},
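(As a rough sketch of the kind of Jackson module mentioned above, a hand-rolled version could look like the following; the module and serializer are made up for illustration, and a library such as jackson-datatype-money can provide the same behaviour out of the box.)

import java.io.IOException;
import java.math.BigDecimal;
import javax.money.MonetaryAmount;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.module.SimpleModule;

// Hypothetical module: serializes MonetaryAmount as { "amount": "5.00", "currency": "USD" }.
public class MonetaryAmountModule extends SimpleModule {
    public MonetaryAmountModule() {
        addSerializer(MonetaryAmount.class, new JsonSerializer<MonetaryAmount>() {
            @Override
            public void serialize(MonetaryAmount value, JsonGenerator gen, SerializerProvider serializers)
                    throws IOException {
                gen.writeStartObject();
                gen.writeStringField("amount",
                        value.getNumber().numberValue(BigDecimal.class).toPlainString());
                gen.writeStringField("currency", value.getCurrency().getCurrencyCode());
                gen.writeEndObject();
            }
        });
    }
}

Such a module would be registered on the ObjectMapper (for example via objectMapper.registerModule(new MonetaryAmountModule()) or as a Spring bean), which affects the JSON at runtime but not the generated OpenAPI schema.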
However, the Swagger UI and the generated schema show this as an example:
"price": {
"code": "string",
"defaultPaymentInfo": true,
"subscriptionCode": "string",
"saved": true,
"paymentMode": {
"id": "string",
"code": "string",
"name": "string",
"description": "string",
"method": {},
"hopChargeUrl": "string",
"cost": {
"negativeOrZero": true,
"positiveOrZero": true,
"negative": true,
"zero": true,
"positive": true,
"context": {
"precision": 0,
"fixedScale": true,
"maxScale": 0,
"empty": true,
"providerName": "string"
},
"currency": {
"defaultFractionDigits": 0,
"numericCode": 0,
"currencyCode": "string",
"context": {
"empty": true,
"providerName": "string"
}
},
"number": {
"amountFractionNumerator": 0,
"amountFractionDenominator": 0,
"precision": 0,
"scale": 0
}
}
},
Is there a possibility to register something like a SchemaCustomizer to provide a custom snippet for fields of type MonetaryAmount? We can't add annotations because we don't have access to the source code, and my wish was to deal with this in a central place, rather than going through all of our DTOs and adding custom annotations every time I see a field of type MonetaryAmount.
Issue Analytics
- Created 4 years ago
- Comments: 8 (7 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hello,
If we suppose you are using this custom class for Schema mapping:
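(The code blocks from the original comment were not captured here; the class below is a rough sketch of the idea, with the name MonetaryAmountSchema and its two fields chosen to mirror the serialized shape shown in the question.)

import java.math.BigDecimal;

// Hypothetical mapping class: springdoc will document fields of this
// class instead of the real MonetaryAmount once it is registered below.
public class MonetaryAmountSchema {
    private BigDecimal amount;   // e.g. 5.00
    private String currency;     // e.g. "USD"

    public BigDecimal getAmount() { return amount; }
    public void setAmount(BigDecimal amount) { this.amount = amount; }
    public String getCurrency() { return currency; }
    public void setCurrency(String currency) { this.currency = currency; }
}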
You can then call:
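(Likewise a sketch, assuming the MonetaryAmountSchema class above and springdoc's SpringDocUtils registry; in springdoc-openapi 2.x the class lives in org.springdoc.core.utils instead of org.springdoc.core.)

import javax.money.MonetaryAmount;
import org.springdoc.core.SpringDocUtils;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OpenApiSchemaConfig {
    static {
        // Tell springdoc to document MonetaryAmount using the mapping class
        // above, in one central place, without touching the individual DTOs.
        SpringDocUtils.getConfig()
                .replaceWithClass(MonetaryAmount.class, MonetaryAmountSchema.class);
    }
}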
@AnthonyGress,
it’s all documented here:
And for concrete Java samples, search for SpringDocUtils in this repo.
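(For the original question about custom example values specifically, SpringDocUtils also offers replaceWithSchema, which lets you build the schema by hand, examples included; the snippet below is a sketch of that variant, with the example values taken from the response shown earlier.)

import java.math.BigDecimal;
import javax.money.MonetaryAmount;
import io.swagger.v3.oas.models.media.NumberSchema;
import io.swagger.v3.oas.models.media.ObjectSchema;
import io.swagger.v3.oas.models.media.StringSchema;
import org.springdoc.core.SpringDocUtils;

public class MonetaryAmountOpenApiConfig {
    static {
        // Replace the reflected MonetaryAmount schema with a hand-built one,
        // so Swagger UI shows { "amount": 5.00, "currency": "USD" } as the example.
        // Note: older swagger-core versions name the method addProperties(...)
        // instead of addProperty(...).
        SpringDocUtils.getConfig().replaceWithSchema(MonetaryAmount.class,
                new ObjectSchema()
                        .addProperty("amount", new NumberSchema().example(new BigDecimal("5.00")))
                        .addProperty("currency", new StringSchema().example("USD")));
    }
}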