Batching doesn't seem to work that well

See original GitHub issue

Step 1: Describe your environment

  • Windows 10
  • Visual Studio 2017
  • .NET Core 2.1

Step 2: Describe the problem

The Durable Http Batching doesn’t seem to respect the batch size.

Steps to reproduce:

  1. Create a .NET Core 2.1 ASP.NET Core Web API project and replace the default ValuesController with this:
   [Route("api/[controller]")]
    [ApiController]
    public class EventsController : ControllerBase
    {
        static List<int> _counts = new List<int>();

        // GET api/events
        [HttpGet]
        public IEnumerable<string> Get()
        {
            return _counts.Select(x=>x.ToString());
        }

        public IActionResult Post([FromBody] EventBatchRequestDto batch)
        {
            _counts.Add(batch.Events.Count());
            return Ok();
        }
    }

    public class EventBatchRequestDto
    {
        public IEnumerable<EventDto> Events { get; set; }
    }


    public class EventDto
    {
        public DateTime Timestamp { get; set; }
        public String Level { get; set; }
        public String MessageTemplate { get; set; }
        public String RenderedMessage { get; set; }
        public String Exception { get; set; }
        public Dictionary<String, dynamic> Properties { get; set; }
        public Dictionary<String, RenderingDto[]> Renderings { get; set; }
    }


    public class RenderingDto
    {
        public String Format { get; set; }
        public String Rendering { get; set; }
        public override Boolean Equals(Object obj)
        {
            if (!(obj is RenderingDto other))
                return false;

            return
                Format == other.Format &&
                Rendering == other.Rendering;
        }

        public override Int32 GetHashCode()
        {
            return 0;
        }
    }
  2. Create a new .NET Core 2.1 console app and make sure you have these NuGet package references:
<PackageReference Include="Bogus" Version="24.1.0" />
<PackageReference Include="Serilog.Sinks.Console" Version="3.1.1" />
<PackageReference Include="Serilog.Sinks.Http" Version="5.0.1" />

Use the following as the console app's Program.cs (make sure you change the HTTP endpoint to your own). The CustomerGenerator, OrderGenerator and SerilogHttpSinkHttpClientWrapper types it references are not shown in the issue; a sketch of what they might look like follows the steps below.

using System;
using System.Net.Http;
using System.Threading;
using Serilog.Sinks.Http.BatchFormatters;

namespace Serilog.Http.Tester
{
    class Program
    {
        static void Main(string[] args)
        {
            Random rand = new Random(5000);

            ILogger logger = new LoggerConfiguration()
                .MinimumLevel.Verbose()
                .WriteTo.DurableHttp(
                    requestUri: "http://localhost:52603/api/events",
                    batchPostingLimit: 10,
                    batchFormatter: new DefaultBatchFormatter(),
                    httpClient: new SerilogHttpSinkHttpClientWrapper(new HttpClient(new HttpClientHandler
                        {
                            ClientCertificateOptions = ClientCertificateOption.Manual,
                            ServerCertificateCustomValidationCallback = (_, __, ___, ____) => true
                        }),
                        true)
                )
                .WriteTo.Console()
                .CreateLogger()
                .ForContext<Program>();

            var customerGenerator = new CustomerGenerator();
            var orderGenerator = new OrderGenerator();
            int i = 0;
            while (true)
            {
                var customer = customerGenerator.Generate();
                var order = orderGenerator.Generate();

                logger.Information("{@customer} placed {@order}", customer, order);
                i++;
                Console.WriteLine($"Sent {i} events");
                Thread.Sleep(rand.Next(0,1000));
            }

        }
    }
}
  3. Run both projects together in Visual Studio.
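
The CustomerGenerator, OrderGenerator and SerilogHttpSinkHttpClientWrapper types referenced by the console app are not shown in the issue. A minimal sketch of what they might look like is given below; the Bogus rules, the DTO shapes, and the IHttpClient member (assumed here to be the Serilog.Sinks.Http 5.x interface exposing PostAsync(string, HttpContent)) are assumptions rather than the original author's code.

using System.Net.Http;
using System.Threading.Tasks;
using Bogus;
using Serilog.Sinks.Http;

// Hypothetical DTOs and Bogus-based generators standing in for the ones used in the repro.
public class Customer
{
    public string Name { get; set; }
    public string Email { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public string Product { get; set; }
    public decimal Total { get; set; }
}

public class CustomerGenerator
{
    private readonly Faker<Customer> _faker = new Faker<Customer>()
        .RuleFor(c => c.Name, f => f.Name.FullName())
        .RuleFor(c => c.Email, f => f.Internet.Email());

    public Customer Generate() => _faker.Generate();
}

public class OrderGenerator
{
    private readonly Faker<Order> _faker = new Faker<Order>()
        .RuleFor(o => o.Id, f => f.Random.Number(1, 100000))
        .RuleFor(o => o.Product, f => f.Commerce.ProductName())
        .RuleFor(o => o.Total, f => f.Finance.Amount());

    public Order Generate() => _faker.Generate();
}

// Hypothetical adapter that forwards the sink's requests to System.Net.Http.HttpClient,
// assuming IHttpClient in Serilog.Sinks.Http 5.x is IDisposable with
// Task<HttpResponseMessage> PostAsync(string requestUri, HttpContent content).
public class SerilogHttpSinkHttpClientWrapper : IHttpClient
{
    private readonly HttpClient _client;
    private readonly bool _disposeClient;

    public SerilogHttpSinkHttpClientWrapper(HttpClient client, bool disposeClient)
    {
        _client = client;
        _disposeClient = disposeClient;
    }

    public Task<HttpResponseMessage> PostAsync(string requestUri, HttpContent content)
        => _client.PostAsync(requestUri, content);

    public void Dispose()
    {
        if (_disposeClient)
        {
            _client.Dispose();
        }
    }
}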

Observed results

So leave it running for a while, then hit the GET endpoint URI in Chrome (which I am using just to see the batch sizes observed by the earlier POST endpoint calls from this Serilog sink); for me this is http://localhost:52603/api/events.
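
Equivalently, instead of a browser, a small snippet like the following can dump the recorded counts (this is not from the original issue; the URL is simply the one used in the repro above):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class BatchSizeProbe
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Prints the JSON array of batch sizes recorded by the controller's GET action.
            string json = await client.GetStringAsync("http://localhost:52603/api/events");
            Console.WriteLine(json);
        }
    }
}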

Expected results

I expected the messages to be batched according to the batchPostingLimit set in the console app code above, which is 10.

But instead I see output like the following. This is with me sending 29 log messages to the Serilog sink, with a random 0-1 s delay between each one:

[screenshot of the GET response showing the observed batch sizes]

Is there some windowing feature at play with the batchPostingLimit?

Even if I leave it out entirely, where the default should be 1000, I get the same sort of results:

[screenshot of the GET response with the default batchPostingLimit]

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

1 reaction
FantasticFiasco commented, Sep 27, 2018

I’ve modified your first version of the controller to look something like this:

[Route("api/[controller]")]
[ApiController]
public class EventsController : ControllerBase
{
    static List<string> _batches = new List<string>();

    [HttpGet]
    public IEnumerable<string> Get()
    {
        return _batches;
    }

    [HttpPost]
    public IActionResult Post([FromBody] EventBatchRequestDto batch)
    {
        _batches.Add($"{DateTime.Now:HH:mm:ss.ff} - {batch.Events.Count()} events");
        return Ok();
    }
}

And the console application looks like this:

static void Main(string[] args)
{
    Random rand = new Random(5000);

    ILogger logger = new LoggerConfiguration()
        .MinimumLevel.Verbose()
        .WriteTo.DurableHttp(
            requestUri: "http://localhost:52603/api/events",
            batchPostingLimit: 30,
            batchFormatter: new DefaultBatchFormatter(),
            httpClient: new SerilogHttpSinkHttpClientWrapper(new HttpClient(new HttpClientHandler
                {
                    ClientCertificateOptions = ClientCertificateOption.Manual,
                    ServerCertificateCustomValidationCallback = (_, __, ___, ____) => true
                }),
                true),
            period: TimeSpan.FromSeconds(2)
        )
        .WriteTo.Console()
        .CreateLogger()
        .ForContext<Program>();

    var customerGenerator = new CustomerGenerator();
    var orderGenerator = new OrderGenerator();

    while (true)
    {
        Thread.Sleep(3000);

        for (int i = 0; i < 100; i++)
        {
            var customer = customerGenerator.Generate();
            var order = orderGenerator.Generate();

            logger.Information("{@customer} placed {@order}", customer, order);
        }
    }
}

The response from the route http://localhost:52603/api/events would look something like this.

[
  "22:44:21.27 - 30 events",
  "22:44:21.33 - 30 events",
  "22:44:21.37 - 30 events",
  "22:44:21.41 - 10 events",
  "22:44:23.44 - 30 events",
  "22:44:23.47 - 30 events",
  "22:44:23.49 - 30 events",
  "22:44:23.53 - 10 events",
  "22:44:27.56 - 30 events",
  "22:44:27.59 - 30 events",
  "22:44:27.60 - 30 events",
  "22:44:27.62 - 10 events"
]

Please note that I am setting the period of the sink to 2 seconds. This means that no more often than every other second, the sink checks whether any log events have been written to disk and are waiting to be posted over the network. If any log events are found, they are batched up according to batchPostingLimit, i.e. a maximum of 30 log events per HTTP request. For 100 log events, that means batches of 30, 30, 30 and finally 10 log events per HTTP request, fired in quick succession. When the sink has sent all batches, and given that no new log events have been written to disk, the log event shipper goes back to sleep and resumes its responsibility after the given period.

Perhaps batchPostingLimit is in need of clarification in the documentation? It does not describe the size of a buffer that is flushed when it gets full. It is instead the maximum number of log events that a single HTTP request can contain. It is meant to be a way for the log event producer to limit the potential size of the HTTP payloads being sent over the network, if for some reason the receiver has limits regarding payload sizes.
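
In other words, with the original repro producing on average roughly two events per second, each wake-up of the shipper only finds the handful of events written since the previous wake-up, so the batches stay small no matter how large batchPostingLimit is. A sketch of a configuration that would actually fill batches of 10 is shown below; the 10-second period is an illustrative assumption, and the optional batchFormatter/httpClient arguments used earlier are omitted on the assumption that they have sensible defaults:

// Sketch only: lengthening the period lets enough events accumulate on disk
// between shipper wake-ups for batchPostingLimit to become the limiting factor.
ILogger logger = new LoggerConfiguration()
    .MinimumLevel.Verbose()
    .WriteTo.DurableHttp(
        requestUri: "http://localhost:52603/api/events",
        batchPostingLimit: 10,              // max log events per HTTP request
        period: TimeSpan.FromSeconds(10))   // assumed value: wake the shipper every 10 s
    .WriteTo.Console()
    .CreateLogger();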

Did this bring any clarity to your issue, or can I help you in any other way?

1 reaction
sachabarber commented, Sep 27, 2018

I think this is the most detailed issue I’ve ever received

Sorry about that
