
ExecuteStoredProcedureStreamAsync is only half-Stream

See original GitHub issue

tl;dr: I would like an overload of ExecuteStoredProcedureStreamAsync that takes a Stream as a parameter.

Scripts.ExecuteStoredProcedureStreamAsync returns a ResponseMessage (i.e., the raw response as a Stream), so we can handle it however we wish. Streams are handy because we control serialization ourselves (among other reasons).

However, it takes a dynamic[] parameter representing the arguments to the stored procedure. Inside ScriptsCore, this array is converted to a stream using this.clientContext.CosmosSerializer.ToStream<dynamic[]>(parameters). In other words, the SDK still uses CosmosSerializer on the way in.
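
For context, a minimal sketch of today's call shape, assuming a hypothetical database, container, stored procedure id, and parameter values; the request side is still serialized by the SDK even though the response comes back as a raw stream:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class SprocCallToday
{
    // Hypothetical database, container, sproc id, and parameter values.
    public static async Task CallAsync(CosmosClient client)
    {
        Container container = client.GetContainer("mydb", "mycontainer");

        // The dynamic[] is serialized inside the SDK via CosmosSerializer,
        // even though the response comes back as a raw ResponseMessage stream.
        using ResponseMessage response = await container.Scripts.ExecuteStoredProcedureStreamAsync(
            storedProcedureId: "bulkImport",
            partitionKey: new PartitionKey("tenant-1"),
            parameters: new dynamic[] { new { id = "doc1" }, new { id = "doc2" } });

        response.EnsureSuccessStatusCode();
        // response.Content is the raw result stream; deserializing it (or not) is up to the caller.
    }
}
```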

While I suppose this is fine, I would like more control over when things get serialized.

From a higher-level point of view, I would prefer that my entire experience with Cosmos be Stream-based and that, when using Stream methods, there is “no such thing” as serialization. In other words, Stream-based methods should never serialize or deserialize “user” objects (though they can still serialize internal structures, e.g. the change feed’s [documents:[…]] wrapper).

Challenges with adding an overload to ExecuteStoredProcedureStreamAsync:

  • The server expects an array, but the user might serialize something else; as a stream, the SDK can’t tell the difference, so the mistake only surfaces when the server (probably) returns a BadRequest.
  • dynamic[] parameters and Stream streamPayload cannot occupy the same parameter position, because callers can pass null and the resulting ambiguity would be a breaking change.

Mitigations:

If the user serializes something that’s not an array, just allow it and let the server blow up with a BadRequest. No big deal.

As to the parameter positions:

  • Option 1: Change the positions. It seems “awkward,” but string storedProcedureId, Stream streamPayload, PartitionKey partitionKey is safely differentiated from string storedProcedureId, PartitionKey partitionKey, dynamic[] parameters.
  • Option 2: Change the name. But I can’t think of a non-awkward alternative name for ExecuteStoredProcedureStreamAsync.
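
To make the positional argument concrete, here is a sketch of the two shapes side by side. The interface is purely illustrative (the real methods live on Microsoft.Azure.Cosmos.Scripts.Scripts), and the Stream overload is the one being proposed in this issue, not a confirmed SDK signature:

```csharp
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.Cosmos.Scripts;

// Illustrative-only interface used to contrast the two overload shapes.
public interface IStoredProcedureStreamApi
{
    // Existing overload: the SDK serializes the dynamic[] with CosmosSerializer.
    Task<ResponseMessage> ExecuteStoredProcedureStreamAsync(
        string storedProcedureId,
        PartitionKey partitionKey,
        dynamic[] parameters,
        StoredProcedureRequestOptions requestOptions = null,
        CancellationToken cancellationToken = default);

    // Proposed overload (Option 1): the caller hands over an already-serialized payload.
    // Because PartitionKey is a struct, a null argument in the second position can only
    // bind to Stream, so the two overloads never collide and the addition is non-breaking.
    Task<ResponseMessage> ExecuteStoredProcedureStreamAsync(
        string storedProcedureId,
        Stream streamPayload,
        PartitionKey partitionKey,
        StoredProcedureRequestOptions requestOptions = null,
        CancellationToken cancellationToken = default);
}
```

With the proposed shape, the caller builds streamPayload with whatever serializer it likes (System.Text.Json, Newtonsoft, hand-rolled), which is exactly the control over serialization asked for above.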

I’ll post a PR for this. But I realize I might be stepping on some design-toes so feel free to consider the PR as just a further explanation to this issue.

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 8 (8 by maintainers)

Top GitHub Comments

1 reaction
kirankumarkolli commented, Dec 2, 2019

Stream APIs make full sense if the required payload arrives at the application layer (Web API, etc.) already as a stream. If the application is preparing the stream itself, then serialization is happening at least once on the client (either in the SDK or in the application).
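
As a hedged sketch of that scenario: an ASP.NET Core endpoint already holds the payload as Request.Body, so a Stream-accepting overload (the one requested in this issue, shown with the Option 1 parameter order) would let it flow straight through without re-serialization. The route, names, and the overload itself are assumptions here:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Cosmos;

[ApiController]
public class BulkImportController : ControllerBase
{
    private readonly Container _container;

    public BulkImportController(Container container) => _container = container;

    [HttpPost("orders/{tenantId}/bulk")]
    public async Task<IActionResult> BulkImport(string tenantId)
    {
        // Request.Body is already a stream: forward it as-is, never deserialized in the app.
        using ResponseMessage response = await _container.Scripts.ExecuteStoredProcedureStreamAsync(
            "bulkImport",
            Request.Body,
            new PartitionKey(tenantId));

        return StatusCode((int)response.StatusCode);
    }
}
```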

We are not against having a stream-input API for SPROCs, but the best option is to address #1057.

/cc: @ealsur, @abhijitpai

0 reactions
joshlang commented, Dec 5, 2019

@abhijitpai Transactions.

The bound is the 2MB payload limit.

What I’d like is predictability in transactions. It’s one of the reasons I was so excited to see TransactionalBatch, and why I was so disappointed to run into the 100-document limit (#1057).

I suppose that 100 documents is… well, predictable.

The way I see it is that stored procedures have an unpredictable ability to do work before saying “Welp, that’s all for you this time!”. But whatever that limit is, it seemed to handle whatever we threw at it within the 2MB payload limit without any problems.

My perception when I saw TransactionalBatch was that there was now predictable functionality: as long as I stayed within the 2MB payload limit, the server would accept it.

In many ways, 100 documents is more predictable than 2MB. Maybe it’s the right way to go (if you need limits). Selfishly, I’d prefer that only the 2MB limit existed. Even more selfishly, I’d prefer that there were no limits at all. But knowing the feasibility of this in your back-end is beyond me.
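
For reference, a minimal TransactionalBatch sketch under the limits being discussed: every operation shares one partition key and the batch commits atomically or not at all. The item type and ids are illustrative:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class BatchExample
{
    private class Order
    {
        public string id { get; set; }
        public string tenantId { get; set; }
    }

    // All operations share one partition key and commit atomically, subject to the
    // payload-size and operation-count limits discussed above.
    public static async Task<bool> SaveAtomicallyAsync(Container container, string tenantId)
    {
        TransactionalBatch batch = container
            .CreateTransactionalBatch(new PartitionKey(tenantId))
            .CreateItem(new Order { id = "order-1", tenantId = tenantId })
            .UpsertItem(new Order { id = "order-2", tenantId = tenantId });

        using TransactionalBatchResponse response = await batch.ExecuteAsync();
        return response.IsSuccessStatusCode;
    }
}
```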

Read more comments on GitHub >

