
Can I have a small sample of sending 40k messages/second?

See original GitHub issue

I have a simple ASP.NET Core API project, hosted as a Windows service. I’ve tried to optimize everything I can, but sending 1,000 requests (each request publishing 20 messages via the NATS client) isn’t as fast as expected.

Here’s a sample of what I’m doing:

    [Route("api/[controller]")]
    [ApiController]
    public class ValuesController : ControllerBase
    {
        private IConnection _connection;

        public ValuesController(IConnection connection)
        {
            _connection = connection;
        }

        [HttpPost("notification")]
        public ActionResult PublishMessage(IEnumerable<Notification> payload)
        {
            if (payload == null)
                return BadRequest("Empty payload");

            Task.Run(() =>
            {
                var notificationBatches = payload.Batch(5, true);
                foreach (var notifications in notificationBatches)
                {
                    Parallel.ForEach(notifications, (notification, state) =>
                        {
                            _connection.Publish(
                                "notifications", // subject name assumed; Publish requires a subject
                                Encoding.UTF8.GetBytes(System.Text.Json.JsonSerializer.Serialize(notification)));
                        });
                }
            });
            return Ok();
        }

        public class Notification
        {
        }
    }

    public static class Ext
    {
        /// <see cref="https://stackoverflow.com/a/11775295"/>
        public static IEnumerable<IEnumerable<T>> Batch<T>(
            this IEnumerable<T> source,
            int size,
            bool buffered = false,
            CancellationToken cancellationToken = default(CancellationToken))
        {
            if (source == null) throw new ArgumentNullException(nameof(source));

            if (size <= 0) throw new ArgumentException("Size must be greater than 0", nameof(size));

            if (buffered == false)
                using (var enumerator = source.GetEnumerator())
                {
                    IEnumerable<T> GetPart<T>(IEnumerator<T> enumerator_, int size_)
                    {
                        do
                        {
                            yield return enumerator_.Current;
                        } while (--size_ > 0 && enumerator_.MoveNext());
                    }

                    while (enumerator.MoveNext())
                    {
                        yield return GetPart(enumerator, size);
                    }
                }
            else
            {
                var batch = new List<T>(size);
                foreach (var item in source)
                {
                    batch.Add(item);
                    if (batch.Count == size)
                    {
                        yield return batch;
                        batch = new List<T>(size);
                    }
                }

                if (batch.Count > 0)
                    yield return batch;
            }
        }
    }

Then I created a small Windows app that makes the requests. I tried 1,000 requests, each with 30 Notifications. Running that test, some responses have a high elapsed time (up to 2,000 ms).

I think the problem is with the connection: when I run the code above without publishing messages (just a Task.Delay(500)), the highest elapsed time is 600 ms.

My target: create an ASP.NET Core Web API that can handle up to 1,000 requests, each request carrying a payload of up to 40 messages.

Please help me.

Duong Nguyen

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 10 (2 by maintainers)

Top GitHub Comments

1 reaction
watfordgnf commented, Jul 6, 2020

Your response times are not apples to apples in these cases. In your Task.Run example the created task is not awaited and the Ok response is returned as soon as the thread is created. In @ColinSullivan1’s example code the Ok response is returned as soon as all of the notifications are sent. You’re comparing the time it takes to spawn a task with the time it takes to send all of the messages.

In any case, Publish is the lowest overhead call in the C# NATS client, so calling it in a loop should be the best performance for a single connection. I’d pull the code out into a console application to benchmark the timing for serialization and conversion to UTF-8 byte arrays as that overhead is non-trivial compared to the time to send a NATS message.
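A minimal sketch of that kind of console benchmark, isolating just the serialization and UTF-8 conversion cost (the `Notification` shape and iteration count here are assumptions, not taken from the thread):

```csharp
using System;
using System.Diagnostics;
using System.Text;
using System.Text.Json;

// Hypothetical payload shape; substitute your real Notification type.
public record Notification(int Id, string Message);

public static class SerializationBenchmark
{
    public static void Main()
    {
        var notification = new Notification(1, "hello");
        const int iterations = 40_000; // roughly one second's worth of target throughput

        // Warm up so JIT compilation doesn't skew the measurement.
        Encoding.UTF8.GetBytes(JsonSerializer.Serialize(notification));

        var sw = Stopwatch.StartNew();
        long totalBytes = 0;
        for (int i = 0; i < iterations; i++)
        {
            // The same two steps the controller performs per message.
            byte[] payload = Encoding.UTF8.GetBytes(JsonSerializer.Serialize(notification));
            totalBytes += payload.Length;
        }
        sw.Stop();

        Console.WriteLine($"{iterations} serializations in {sw.ElapsedMilliseconds} ms ({totalBytes} bytes)");
    }
}
```

Comparing the per-message time this reports against the NATS publish time shows how much of the latency is serialization rather than the connection itself.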

Two notes:

  1. I would not Publish in parallel using the same connection instance; you’ll slow yourself down with per-connection locking. You could Publish in parallel using different connection instances, as they won’t contend for the same locks.
  2. If you want to merely return upon “accepting” the messages, but not publishing, you should have a background service that handles this work. Each request would simply enqueue the messages for the background service.

Given IMessageQueue which is a singleton:

    [HttpPost("notification")]
    public async Task<ActionResult> PublishMessage(
        [FromServices] IMessageQueue messageQueue,
        [FromBody] IEnumerable<Notification> payload)
    {
        if (payload == null)
            return BadRequest("Empty payload");
        
        foreach (var notification in payload)
        {
            await messageQueue.EnqueueAsync(CreateMessage(notification));
        }

        return Ok();
    }

With a background service like:

    public class MessageSenderBackgroundService : BackgroundService
    {
        private readonly IMessageQueue _messageQueue;

        public MessageSenderBackgroundService(IMessageQueue messageQueue)
        {
            _messageQueue = messageQueue;
        }

        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            using (var connection = /* create or inject your NATS connection */)
            {
                while (!stoppingToken.IsCancellationRequested)
                {
                    Message message = await _messageQueue.DequeueAsync(stoppingToken);
                    connection.Publish(message.Subject, message.Payload);
                    // pick some heuristic to flush if the default isn't reasonable
                }
            }
        }
    }
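The `IMessageQueue` referenced above is never defined in the thread; a minimal sketch backed by `System.Threading.Channels` (the `Message` shape and all names here are assumptions to match the background service) could look like:

```csharp
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;

// Hypothetical message shape matching the background service sketch.
public record Message(string Subject, byte[] Payload);

public interface IMessageQueue
{
    ValueTask EnqueueAsync(Message message, CancellationToken cancellationToken = default);
    ValueTask<Message> DequeueAsync(CancellationToken cancellationToken = default);
}

// Unbounded in-process queue; register as a singleton in DI.
public class ChannelMessageQueue : IMessageQueue
{
    private readonly Channel<Message> _channel =
        Channel.CreateUnbounded<Message>(new UnboundedChannelOptions
        {
            SingleReader = true // only the background service dequeues
        });

    public ValueTask EnqueueAsync(Message message, CancellationToken cancellationToken = default)
        => _channel.Writer.WriteAsync(message, cancellationToken);

    public ValueTask<Message> DequeueAsync(CancellationToken cancellationToken = default)
        => _channel.Reader.ReadAsync(cancellationToken);
}
```

Registered with something like `services.AddSingleton<IMessageQueue, ChannelMessageQueue>()` and `services.AddHostedService<MessageSenderBackgroundService>()`, the controller returns as soon as the messages are enqueued while publishing happens on a single connection in the background.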

0 reactions
haiduong87 commented, Jul 8, 2020

@watfordgnf @ColinSullivan1 I’ve uploaded my latest work here: https://github.com/haiduong87/NotificationServer

Please help me check my usage of NatsClient.

Thank you!
