PublishEventAsync results in Grpc.Core.RpcException (Connection refused) even when the sidecar is running (v1.5.1)
Expected Behavior
Calling the DAPR .NET SDK client's PublishEventAsync method should publish the event as expected.
Background
The DAPR sidecar is started just once with these arguments:
dapr run --app-id order-service --components-path /home/simon/dapr-workshop/components/local --app-port 5100 --dapr-grpc-port 5101 --dapr-http-port 5180
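Because the sidecar is started on non-default ports, the SDK can only find it if the DAPR_GRPC_PORT and DAPR_HTTP_PORT environment variables are visible to the application process; dapr run sets them only for the command it launches itself. A minimal diagnostic sketch, assuming the application is the ASP.NET Core order service with the injected _logger used further below:

// Log the sidecar ports this process actually sees. If DAPR_GRPC_PORT is null here,
// the parameterless DaprClientBuilder falls back to its default gRPC port rather
// than the 5101 passed to `dapr run` above.
var grpcPort = Environment.GetEnvironmentVariable("DAPR_GRPC_PORT");
var httpPort = Environment.GetEnvironmentVariable("DAPR_HTTP_PORT");
_logger.LogInformation("DAPR_GRPC_PORT={GrpcPort}, DAPR_HTTP_PORT={HttpPort}",
    grpcPort ?? "<not set>", httpPort ?? "<not set>");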
Pubsub component config is
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: dapr-workshop.pubsub
spec:
  type: pubsub.redis
  metadata:
  - name: redisHost
    value: 127.0.0.1:6379
  - name: redisPassword
    value: ""
This code throws an exception and no message is published.
using var daprClient = new DaprClientBuilder().Build();
await daprClient.PublishEventAsync("dapr-workshop.pubsub", "orders", orderSummary);
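The failure pattern is consistent with the gRPC client targeting the SDK's default sidecar port instead of the custom 5101. As a workaround sketch, assuming DAPR_GRPC_PORT is not set in the application's environment, the builder can be pointed at the sidecar explicitly (the UseGrpcEndpoint builder method should be available in this SDK version):

// Workaround sketch: target the custom gRPC port from the `dapr run` command above
// instead of relying on DAPR_GRPC_PORT or the SDK default.
using var daprClient = new DaprClientBuilder()
    .UseGrpcEndpoint("http://127.0.0.1:5101")
    .Build();
await daprClient.PublishEventAsync("dapr-workshop.pubsub", "orders", orderSummary);

Alternatively, exporting DAPR_GRPC_PORT=5101 (and DAPR_HTTP_PORT=5180) in the shell that starts the application should let the original parameterless builder work unchanged, since the SDK reads those variables to locate the sidecar.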
In the exact same .cs file I can implement the above as follows; no error occurs and my message is published.
var port = Environment.GetEnvironmentVariable("DAPR_HTTP_PORT");
var pubsubName = "dapr-workshop.pubsub";
var topic = "orders";
using (var httpClient = new HttpClient())
{
    var result = await httpClient.PostAsync(
        $"http://127.0.0.1:{port}/v1.0/publish/{pubsubName}/{topic}",
        new StringContent(JsonSerializer.Serialize(orderSummary), Encoding.UTF8, "application/json")
    );
    _logger.LogInformation($"Order with id {orderSummary.OrderId} published with status {result.StatusCode}!");
}
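The HTTP workaround presumably succeeds because it builds the URL from DAPR_HTTP_PORT, which resolves to the sidecar's actual HTTP port (5180) in this environment. PublishEventAsync, by contrast, goes over gRPC; if DAPR_GRPC_PORT is not visible to the application process, the client falls back to the SDK's default gRPC port and the connection is refused, which matches the stack trace below.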
I can also do a direct HTTP POST with a JSON payload to the DAPR publish endpoint and the message is sent as expected.
Actual Behavior
When the DAPR .NET SDK client method is used, the result is the following exception.
[2021-12-30 01:29:52.451 ERR] [Serilog.AspNetCore.RequestLoggingMiddleware] HTTP POST /order responded 500 in 2514.4281 ms
Dapr.DaprException: Publish operation failed: the Dapr endpoint indicated a failure. See InnerException for details.
---> Grpc.Core.RpcException: Status(StatusCode="Internal", Detail="Error starting gRPC call. HttpRequestException: Connection refused SocketException: Connection refused", DebugException="System.Net.Http.HttpRequestException: Connection refused
---> System.Net.Sockets.SocketException (111): Connection refused
at System.Net.Http.ConnectHelper.ConnectAsync(String host, Int32 port, CancellationToken cancellationToken)
--- End of inner exception stack trace ---
at System.Net.Http.ConnectHelper.ConnectAsync(String host, Int32 port, CancellationToken cancellationToken)
at System.Net.Http.HttpConnectionPool.ConnectAsync(HttpRequestMessage request, Boolean allowHttp2, CancellationToken cancellationToken)
at System.Net.Http.HttpConnectionPool.GetHttp2ConnectionAsync(HttpRequestMessage request, CancellationToken cancellationToken)
at System.Net.Http.HttpConnectionPool.SendWithRetryAsync(HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)
at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
at System.Net.Http.DiagnosticsHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
at Grpc.Net.Client.Internal.GrpcCall`2.RunCall(HttpRequestMessage request, Nullable`1 timeout)")
at Dapr.Client.DaprClientGrpc.MakePublishRequest(String pubsubName, String topicName, ByteString content, Dictionary`2 metadata, CancellationToken cancellationToken)
--- End of inner exception stack trace ---
at Dapr.Client.DaprClientGrpc.MakePublishRequest(String pubsubName, String topicName, ByteString content, Dictionary`2 metadata, CancellationToken cancellationToken)
at Vigilantes.DaprWorkshop.OrderService.Controllers.OrderController.NewOrder(CustomerOrder order) in /home/simon/dapr-workshop/Vigilantes.DaprWorkshop.OrderService/csharp/Controllers/OrderController.cs:line 57
at Microsoft.AspNetCore.Mvc.Infrastructure.ActionMethodExecutor.TaskOfIActionResultExecutor.Execute(IActionResultTypeMapper mapper, ObjectMethodExecutor executor, Object controller, Object[] arguments)
at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeActionMethodAsync>g__Awaited|12_0(ControllerActionInvoker invoker, ValueTask`1 actionResultValueTask)
at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeNextActionFilterAsync>g__Awaited|10_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Rethrow(ActionExecutedContextSealed context)
at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeInnerFilterAsync>g__Awaited|13_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeFilterPipelineAsync>g__Awaited|19_0(ResourceInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope)
at Microsoft.AspNetCore.Routing.EndpointMiddleware.<Invoke>g__AwaitRequestTask|6_0(Endpoint endpoint, Task requestTask, ILogger logger)
at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context)
at Serilog.AspNetCore.RequestLoggingMiddleware.Invoke(HttpContext httpContext)
Steps to Reproduce the Problem
Running on WSL under Windows 11. WSL is running Ubuntu 20.04 with Docker CE.
DAPR was installed using dapr init and reports itself as v1.5.1.
DAPR .NET SDK version is 1.5.0.
.NET SDK version is 3.1.416
Top GitHub Comments
Thank you.
I think this needs to be mentioned in a more prominent place, not just in the reference documentation. It is too easy to miss and to make this mistake.
Sorry for asking in a closed issue @halspang @rynowak - could you please point me to the documentation of this behavior? We just ran into it, as well 😉
Thanks!