WaveFormat cannot be set for WASAPI Loopback Capture
See original GitHub issue. I am trying to record from the speaker (loopback recording) at sample rate 8000 with 1 channel; the problem is that the code records at sample rate 44100 with 2 channels, format…
Stream stream = new System.IO.MemoryStream();
waveIn = new WasapiLoopbackCapture();
waveIn.DataAvailable += new EventHandler<NAudio.Wave.WaveInEventArgs>(sourceStream_DataRecord);
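// waveIn.WaveFormat is the device's shared-mode mix format (e.g. 44100 Hz, 2 channels, IEEE float)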
waveWriter = new NAudio.Wave.WaveFileWriter(stream, waveIn.WaveFormat);
waveIn.StartRecording();
When I add
waveIn.WaveFormat = new NAudio.Wave.WaveFormat(16000, 1);
it throws this exception:
WaveFormat cannot be set for WASAPI Loopback Capture
Is there any way to record it in 16000 Hz, 1-channel format… actually I don't want to convert it after the recording is complete… I want it at start time… please help me with this.
Issue Analytics
- State:
- Created 7 years ago
- Comments: 20 (8 by maintainers)
Top Results From Across the Web
c# - NAudio Wasapi recording and conversion
WASAPI always records audio as IEEE floating point samples. So in the recorded buffer you get in the callback, every 4 bytes is...
WaveFormat cannot be set for WASAPI Loopback Capture
I am trying to record from speaker (loopback recording) in sample rate-8000 and channel-1, problem is code is recording in sample rate-44100 ...
WasapiLoopbackCapture.cs
/NAudio/Wave/WaveInputs/WasapiLoopbackCapture.cs ... WaveFormat; }; set { throw new InvalidOperationException("WaveFormat cannot be set for WASAPI Loopback ...
Persistent audio discontinuity in WASAPI loopback capture ...
I am writing a program that captures the output on a Windows device using WASAPI loopback capture. In principle it works correctly, ...
NAudio - RSSing.com
i'm trying to write a simple stereo-mix substitute in c#, but can't seem to get it working even remotely. public void StartRecording() {...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
You are confusing the length of the buffer in bytes with the number of samples. In WaveInEventArgs you should use BytesRecorded. Divide it by 4 to get the number of samples, and then by 2 (assuming stereo) to get the number of sample frames.
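As a minimal sketch of that arithmetic (assuming the usual stereo, 32-bit IEEE float mix format; OnDataAvailable is a hypothetical handler name wired to waveIn.DataAvailable):

void OnDataAvailable(object sender, WaveInEventArgs e)
{
    const int bytesPerSample = 4;                    // 32-bit IEEE float
    const int channels = 2;                          // assuming a stereo mix format
    int samples = e.BytesRecorded / bytesPerSample;  // total samples in this buffer
    int frames = samples / channels;                 // sample frames (one L+R pair each)

    // Each 4-byte group in e.Buffer is one float sample, interleaved L, R, L, R, ...
    float firstSample = BitConverter.ToSingle(e.Buffer, 0);
}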
Take a look at my article on input-driven resampling for one way to do this.
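For reference, here is a minimal sketch of that input-driven approach, not the article's exact code: capture at whatever mix format the device reports, then downmix and resample to 16 kHz mono on the way into the file. It assumes a stereo IEEE float mix format; the output file name and scratch buffer size are arbitrary placeholders.

using System;
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

class LoopbackTo16kMono
{
    static void Main()
    {
        var capture = new WasapiLoopbackCapture();

        // Buffer that accepts audio in the device's native mix format
        // (typically 44100 or 48000 Hz, stereo, 32-bit IEEE float).
        var buffer = new BufferedWaveProvider(capture.WaveFormat) { ReadFully = false };

        // Downmix to mono (assumes a stereo mix format), resample to
        // 16 kHz, then convert to 16-bit PCM for the WAV file.
        var mono = new StereoToMonoSampleProvider(buffer.ToSampleProvider());
        var resampled = new WdlResamplingSampleProvider(mono, 16000);
        var output = resampled.ToWaveProvider16();

        var writer = new WaveFileWriter("loopback16k.wav", output.WaveFormat);
        var scratch = new byte[16384];

        capture.DataAvailable += (s, e) =>
        {
            // Feed captured bytes in, then drain whatever the
            // conversion chain can produce and write it to the file.
            buffer.AddSamples(e.Buffer, 0, e.BytesRecorded);
            int read;
            while ((read = output.Read(scratch, 0, scratch.Length)) > 0)
            {
                writer.Write(scratch, 0, read);
            }
        };
        capture.RecordingStopped += (s, e) => writer.Dispose();

        capture.StartRecording();
        Console.WriteLine("Recording... press Enter to stop.");
        Console.ReadLine();
        capture.StopRecording();
    }
}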