Akka.NET shows much better benchmark performance than Process

See original GitHub issue

I ran the ping-pong benchmark in Akka.NET (http://getakka.net/) and, if the numbers can be believed, Akka.NET’s performance is orders of magnitude better than Process for message sending. Both Akka.NET and language-ext were compiled in release mode with Any CPU.
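
For context, the ping-pong benchmark is just a pair of actors bouncing a message back and forth and counting round trips. A minimal Akka.NET sketch of that pattern follows; the class names and message budget are illustrative, not the benchmark’s actual source.

using System;
using Akka.Actor;

// Pong answers every message it receives.
class PongActor : ReceiveActor
{
    public PongActor()
    {
        Receive<string>(_ => Sender.Tell("pong"));
    }
}

// Ping counts replies and shuts the system down once the budget is spent.
class PingActor : ReceiveActor
{
    readonly IActorRef pong;
    int remaining;

    public PingActor(IActorRef pong, int messages)
    {
        this.pong = pong;
        remaining = messages;
        Receive<string>(_ =>
        {
            if (--remaining > 0) this.pong.Tell("ping");
            else Context.System.Terminate();
        });
    }

    protected override void PreStart() => pong.Tell("ping");
}

class Program
{
    static void Main()
    {
        var system = ActorSystem.Create("pingpong");
        var pong = system.ActorOf(Props.Create(() => new PongActor()), "pong");
        var ping = system.ActorOf(Props.Create(() => new PingActor(pong, 1000000)), "ping");
        system.WhenTerminated.Wait();
    }
}

The real benchmark runs several such pairs concurrently (the Actor Count of 16 below) and, judging by the output, varies a per-run throughput setting, which appears to be what the left-hand column of the table refers to.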

Akka.NET Ping Pong:

Worker threads:         2047
OSVersion:              Microsoft Windows NT 6.2.9200.0
ProcessorCount:         8
ClockSpeed:             2801 MHZ
Actor Count:            16
Messages sent/received: 30000000  (3e7)

ActorBase    first start time: 7.32 ms
ReceiveActor first start time: 25.93 ms

            ActorBase                          ReceiveActor
Throughput, Msgs/sec, Start [ms], Total [ms],  Msgs/sec, Start [ms], Total [ms]
         1,  4863000,       4.94,    6172.97,   4745000,       4.55,    6327.55
         5, 13495000,       3.52,    2226.95,  13026000,       3.35,    2306.68
        10, 16816000,       3.46,    1787.97,  16438000,       3.19,    1829.12
        15, 18170000,       5.60,    1656.70,  18126000,       4.12,    1659.36
        20, 20188000,       4.24,    1490.51,  18315000,       3.22,    1641.94
        30, 20229000,       3.35,    1487.07,  18484000,       3.19,    1626.61
        40, 17964000,       3.68,    1674.60,  18226000,       3.32,    1649.93
        50, 19659000,       5.00,    1531.84,  18856000,       3.42,    1595.30
        60, 19815000,       4.63,    1519.11,  19417000,       3.07,    1549.01
        70, 18844000,       3.52,    1596.22,  19828000,       3.36,    1517.10
        80, 19893000,       5.09,    1513.93,  18808000,       3.81,    1599.36
        90, 19023000,       4.78,    1581.97,  18427000,       3.54,    1631.95
       100, 19181000,       4.78,    1568.93,  18575000,       3.37,    1619.33
       200, 19933000,       3.46,    1509.04,  19243000,       3.16,    1563.03

language-ext Throughput:

Sleeping for 20 seconds whilst it warms up
  [... countdown repeats once per second ...]
Sleeping for 1 seconds whilst it warms up
Warm up sent 949384 messages. Running for real now...
917442 messages sent in 20 seconds.
That's 45872.1 messages per second
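
For comparison, the language-ext test drives the Process system through its static API rather than actor classes. The sketch below only approximates that style, and assumes the 2016-era LanguageExt.Process functions spawn, tell and reply; the process names and message budget are illustrative, not the benchmark’s actual source.

using System;
using LanguageExt;
using static LanguageExt.Process;

class Program
{
    static void Main()
    {
        int remaining = 1000000;

        // 'pong' replies to the sender of every message it receives.
        ProcessId pong = spawn<string>("pong", _ => reply("pong"));

        // 'ping' fires the next round until the message budget is spent.
        ProcessId ping = spawn<string>("ping", _ =>
        {
            if (--remaining > 0) tell(pong, "ping");
        });

        tell(ping, "start");   // kick things off; pong's replies land back in ping's inbox
        Console.ReadLine();
    }
}

Note that the reported 45,872.1 messages/second is simply the 917,442 messages sent divided by the 20-second run length.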

Issue Analytics

  • State: closed
  • Created 7 years ago
  • Comments: 5 (4 by maintainers)

Top GitHub Comments

12 reactions
louthy commented, Aug 18, 2016

@jcoder58 Just a follow-up from this. I have done some initial optimisation (the very low-hanging fruit) and have seen a speed-up of 32.73x. It won’t be available for a while, but I thought you’d be interested to know that your example of 45,872 messages/second would become on the order of 1,501,390 messages/second.

I think this bodes well for future optimisations.

3 reactions
louthy commented, Jul 15, 2016

@jcoder58 The Process library has so far been through zero optimisation passes, so it’s not surprising that it currently lags behind Akka. Over the past year I have mostly been bringing features into the library, since it’s ‘quick enough’ and I didn’t want to optimise prematurely. I use it for a large project that will probably need some more juice over the next 6 months or so, so expect significant improvements, although Akka parity isn’t the priority - the quality of the API and the system as a whole is more important to me.

This is personal opinion, so take it with a pinch of salt, but I am really no fan of the very OO/Java-esque style of Akka, and that was the main reason I wrote the Process library. I am also no fan of its untyped leanings; I have seen in the past the benefits of strongly typing the bridges between disparate systems. For example, the Process library checks whether the actor you’re sending a message to can actually receive it, so you get errors at the point of use rather than via dead-letters (even when messaging remotely). Some of the additional work this library does will almost certainly mean it can never reach Akka speeds, but I don’t see any reason why it couldn’t get to 10x with some careful optimisation.
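
To make that point-of-use check concrete, here is a rough, hypothetical illustration of the typed-inbox idea described above. The process name is made up, and the comments merely restate the behaviour claimed in the comment; the exact error-reporting mechanism is the library’s own and is not shown here.

using System;
using static LanguageExt.Process;

class Example
{
    static void Run()
    {
        // A process whose inbox only understands int messages.
        var counter = spawn<int>("counter", n => Console.WriteLine("count: " + n));

        tell(counter, 42);       // fine: the message type matches the inbox
        tell(counter, "oops");   // mismatch: per the comment above, the error is
                                 // surfaced at this call site rather than the message
                                 // being silently routed to dead-letters
    }
}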

Read more comments on GitHub

