BaluMediaServer.CameraStreamer
1.5.16
dotnet add package BaluMediaServer.CameraStreamer --version 1.5.16
Balu Media Server - MAUI RTSP Server for Android
A powerful, lightweight, and easy-to-integrate RTSP server library for .NET MAUI on Android. Stream live camera feeds with MJPEG and H.264 codecs, featuring both RTSP and HTTP streaming capabilities.
Project Motivation
This project was born from a real need: I wanted to run an RTSP server on a custom Android device. However, since I'm not a fan of Java/Kotlin and currently focusing on C# for my specialization, I chose to develop this using .NET MAUI, a C# cross-platform framework that I love.
I quickly discovered a lack of existing libraries for RTSP streaming on Android in the C# ecosystem, especially for mobile devices. So I decided to mix both worlds: use Kotlin for low-level Android camera access, and C# for everything else.
Purpose and Vision
The aim is to offer a simple, easily integrable, and lightweight RTSP server for Android using MAUI. It supports raw camera frame capture and streaming over RTSP using MJPEG and H.264 codecs. I hope this helps other developers avoid the struggles I faced, and have a better, cleaner entry point into mobile RTSP streaming using MAUI and C#.
MIT licensed. Free for everyone. No strings attached.
What's Inside?
Kotlin AAR Module
- Handles low-level camera access using Android's native APIs
- Designed to allow streaming from front, back, or both cameras
- Delivers frames in YUV_420 format via two callbacks
- Can work without a display (headless mode)
MAUI Integration
- Uses a .NET MAUI Library to integrate with the AAR
- Provides two camera services: FrontCameraService and BackCameraService
- Real-time frame capture at resolutions from 320x240 up to 4K UHD (3840x2160), device dependent
- Automatic encoder resolution validation with graceful fallback
- Dynamic memory-optimized buffer management for high resolutions
- Default frame rate: 45 FPS (adjusts dynamically)
RTSP Server (Pure C#)
- Full RTSP Protocol Compliance: Follows RTSP, RTP, and RTCP specifications
- Dual Codec Support:
- MJPEG: Works smoothly; higher bandwidth (no inter-frame compression)
- H.264: Hardware-accelerated encoding, optimized for MediaTek devices
- High Concurrency: Can handle at least 12 simultaneous clients (tested)
- Authentication: Digest authentication included for basic security
- Transport Modes: UDP and TCP interleaved support
- Dynamic Bitrate: Automatic adjustment based on network conditions
- Multiple Profiles: Support for /live/front and /live/back routes
- Robust Client Lifecycle: Graduated error counting, timeout protection, and race-free cleanup
- Cross-SoC Compatibility: Wall-clock RTP timestamps and MediaTek-safe encoder configuration
MJPEG HTTP Server
- Simple, independent MJPEG server for easy HTML display
- Allows usage of <img src="http://your-device:port/mjpeg" /> in web pages
- Built to avoid duplicate frame processing; shares frames with the RTSP stream
- Dual camera support with separate endpoints
Utility Features
- Start/stop MJPEG or camera services via EventBus
- Snapshot capability using callbacks
- Built-in foreground service for background compatibility
- Simple demo project included
- Callbacks available to monitor connected clients, stream status, etc.
- Advanced watchdog system with 60-second inactivity timeout and comprehensive health monitoring
- Automatic resource cleanup and memory management
Modular RTSP Architecture (v1.5.8+)
The RTSP server has been refactored into focused, testable modules for better maintainability:
RTSP/
├── Server.cs                     # Main composition root (~700 lines)
├── Protocol/
│   ├── RtspProtocolHandler.cs    # RTSP request parsing and response handling
│   └── SdpGenerator.cs           # SDP generation for H.264/MJPEG
├── Transport/
│   ├── TransportManager.cs       # UDP/TCP sending, port management
│   ├── RtpPacketBuilder.cs       # RTP packet creation and NAL fragmentation
│   └── RtcpManager.cs            # RTCP sender reports and receiver feedback
├── Streaming/
│   ├── StreamingController.cs    # Main streaming orchestration with timeout protection
│   ├── H264EncoderManager.cs     # H.264 encoder lifecycle management
│   ├── JpegEncoderService.cs     # Shared JPEG encoding for MJPEG clients
│   └── FramePacer.cs             # Frame delivery timing and burst throttling
├── Security/
│   └── AuthenticationManager.cs  # Digest/Basic authentication
└── ClientManagement/
    └── ClientManager.cs          # Client lifecycle and cleanup
Benefits:
- Single Responsibility: Each module handles one specific concern
- Testability: Interfaces enable dependency injection and unit testing
- Maintainability: Smaller files (~150-350 lines) are easier to navigate
- Error Tracking: Stack traces point to specific modules
- Extensibility: Modules can be extended or replaced independently
Installation
Prerequisites
- .NET 9.0 or later (partial support on .NET 8.0)
- Android SDK API Level 26+ (Android 8.0+)
- Visual Studio 2022 with MAUI workload
- Android device or emulator
NuGet Package
<PackageReference Include="BaluMediaServer.CameraStreamer" Version="1.5.16" />
Manual Installation
- Clone this repository
- Add the project reference to your MAUI application
- Add required permissions to your AndroidManifest.xml
Required Permissions
Add these permissions to your Platforms/Android/AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_DATA_SYNC" />
<uses-permission android:name="android.permission.POST_NOTIFICATIONS" />
Quick Start
Basic RTSP Server Setup
using BaluMediaServer.Services;
using BaluMediaServer.Models;
public class MainPage : ContentPage
{
private Server _rtspServer;
public MainPage()
{
InitializeComponent();
// Request camera permissions
RequestPermissions();
// Initialize RTSP server with required authentication
_rtspServer = new Server(
Port: 7778, // RTSP port
MaxClients: 12, // Maximum concurrent clients
Address: "0.0.0.0", // Bind address
Users: new Dictionary<string, string>
{
{ "admin", "password123" } // Authentication is required
}
);
// Start the server
bool started = _rtspServer.Start();
if (started)
{
Console.WriteLine("RTSP Server started successfully!");
Console.WriteLine($"Front camera: rtsp://your-ip:7778/live/front");
Console.WriteLine($"Back camera: rtsp://your-ip:7778/live/back");
}
}
private async void RequestPermissions()
{
await Permissions.RequestAsync<Permissions.Camera>();
}
protected override void OnDisappearing()
{
_rtspServer?.Stop();
base.OnDisappearing();
}
}
MJPEG HTTP Server Setup
using BaluMediaServer.Services;
using BaluMediaServer.RTSP;
public class StreamingPage : ContentPage
{
private MjpegServer _mjpegServer;
public StreamingPage()
{
InitializeComponent();
// Initialize MJPEG server
_mjpegServer = new MjpegServer(port: 8089);
// Start MJPEG streaming
_mjpegServer.Start();
// Or use EventBus for decoupled control
EventBuss.SendCommand(BussCommand.START_MJPEG_SERVER);
}
protected override void OnDisappearing()
{
_mjpegServer?.Stop();
EventBuss.SendCommand(BussCommand.STOP_MJPEG_SERVER);
base.OnDisappearing();
}
}
Using with .NET MAUI
using BaluMediaServer.Services;
using BaluMediaServer.RTSP;
using BaluMediaServer.Models;
public partial class MainPage : ContentPage
{
private Server _rtspServer;
private MjpegServer _mjpegServer;
private bool _isStreaming = false;
private int _clientCount = 0;
public MainPage()
{
InitializeComponent();
InitializeServers();
}
private async void InitializeServers()
{
// Request camera permissions
await Permissions.RequestAsync<Permissions.Camera>();
// Setup authentication (required)
var users = new Dictionary<string, string>
{
{ "admin", "password123" },
{ "viewer", "readonly" }
};
// Initialize servers
_rtspServer = new Server(
Port: 7778,
MaxClients: 12,
Address: "0.0.0.0",
Users: users // Authentication is required
);
_mjpegServer = new MjpegServer(port: 8089);
// Subscribe to events
Server.OnStreaming += OnStreamingStateChanged;
Server.OnClientsChange += OnClientsChanged;
// Start RTSP server
bool started = _rtspServer.Start();
if (started)
{
var localIP = GetLocalIPAddress();
DisplayAlert("Server Started",
$"RTSP Server running at:\n" +
$"rtsp://{localIP}:7778/live/back\n" +
$"rtsp://{localIP}:7778/live/front", "OK");
}
}
private void OnStreamingStateChanged(object? sender, bool isStreaming)
{
_isStreaming = isStreaming;
MainThread.BeginInvokeOnMainThread(() => {
// Update UI
StatusLabel.Text = isStreaming ? "Live" : "Offline";
});
}
private void OnClientsChanged(List<Client> clients)
{
_clientCount = clients.Count(c => c.Socket?.Connected ?? false);
MainThread.BeginInvokeOnMainThread(() => {
ClientCountLabel.Text = $"{_clientCount} client(s) connected";
});
}
private void OnStartStreamingClicked(object sender, EventArgs e)
{
_mjpegServer.Start();
EventBuss.SendCommand(BussCommand.START_CAMERA_BACK);
EventBuss.SendCommand(BussCommand.START_CAMERA_FRONT);
}
private void OnStopStreamingClicked(object sender, EventArgs e)
{
_mjpegServer.Stop();
EventBuss.SendCommand(BussCommand.STOP_CAMERA_BACK);
EventBuss.SendCommand(BussCommand.STOP_CAMERA_FRONT);
}
protected override void OnDisappearing()
{
_rtspServer?.Stop();
_mjpegServer?.Stop();
base.OnDisappearing();
}
}
Detailed API Documentation
Code Documentation
All classes in this library include comprehensive XML documentation comments for IntelliSense support. This provides:
- Class-level summaries describing the purpose of each component
- Method documentation with <param>, <returns>, and <summary> tags
- Property documentation explaining what each property represents
- Event documentation describing when events are raised
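For example, an XML-documented method in this style looks like the following (the method shown is illustrative, not a specific API from the library):

```csharp
/// <summary>
/// Captures a single JPEG snapshot from the active camera.
/// </summary>
/// <param name="quality">JPEG compression quality (1-100).</param>
/// <returns>The encoded JPEG bytes, or null if no frame is available.</returns>
public byte[]? TakeSnapshot(int quality = 80)
{
    // ... implementation elided; only the doc-comment shape matters here ...
    return null;
}
```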
Documented Classes:
| Category | Classes |
|----------|---------|
| Models | Client, FrameEventArgs, H264FrameEventArgs, VideoProfile, VideoResolution, ServerConfiguration, RtspRequest, RtspAuth, EncoderInfo |
| Enums | AuthType, CodecType, BussCommand, TransportMode, VideoResolution |
| Services | Server, MjpegServer, FrontCameraService, BackCameraService |
| Encoders | H264Encoder, MediaTekH264Encoder |
| Utilities | EventBuss, FrameConverterHelper, FrameCallback |
| Interfaces | ICameraService, IAuthenticationManager, IClientManager, IH264EncoderManager, IRtcpManager, IRtpPacketBuilder, IRtspProtocolHandler, ISdpGenerator, IStreamingController, ITransportManager |
| RTSP Modules | AuthenticationManager, ClientManager, H264EncoderManager, JpegEncoderService, RtcpManager, RtpPacketBuilder, RtspProtocolHandler, SdpGenerator, StreamingController, TransportManager, FramePacer |
Server Class
The main RTSP server implementation that handles client connections and streaming.
Constructor
public Server(
int Port = 7778, // RTSP server port
int MaxClients = 100, // Maximum concurrent clients
string Address = "0.0.0.0", // Bind address
Dictionary<string, string>? Users = null, // Authentication users (optional, default admin user is created)
bool BackCameraEnabled = true, // Enable or disable back camera
bool FrontCameraEnabled = true, // Enable or disable front camera
bool AuthRequired = true, // Set to false to skip authentication even when Users is provided (recommended only for testing)
int MjpegServerQuality = 80, // Sets a default Mjpeg Image compression quality
int MjpegServerPort = 8089, // MJPEG HTTP server port
bool UseHttps = false, // Enable HTTPS for MJPEG server
string? CertificatePath = null, // Path to SSL certificate
string? CertificatePassword = null, // Certificate password
VideoResolution BackCameraResolution = VideoResolution.VGA_640x480, // Back camera resolution
VideoResolution FrontCameraResolution = VideoResolution.VGA_640x480 // Front camera resolution
)
public Server(
ServerConfiguration config // Simple class to configure the server
)
ServerConfiguration Class
A configuration class that simplifies server initialization with all available options.
public class ServerConfiguration
{
public int Port { get; set; } = 7778; // RTSP server port
public int MaxClients { get; set; } = 10; // Maximum concurrent clients
public Dictionary<string, string> Users { get; set; } = new(); // Authentication users
public int MjpegServerQuality { get; set; } = 80; // MJPEG compression quality
public int MjpegServerPort { get; set; } = 8089; // MJPEG HTTP server port
public bool AuthRequired { get; set; } = true; // Enable authentication
public bool FrontCameraEnabled { get; set; } = true; // Enable front camera
public bool BackCameraEnabled { get; set; } = true; // Enable back camera
public bool StartMjpegServer { get; set; } = true; // Auto-start MJPEG server
public bool EnableServer { get; set; } = true; // Enable/disable server startup
public string BaseAddress { get; set; } = "0.0.0.0"; // Bind address
public VideoProfile PrimaryProfile { get; set; } = new(); // Primary video profile
public VideoProfile SecondaryProfile { get; set; } = new(); // Secondary video profile
// Resolution configuration (affects H.264 encoder)
public VideoResolution BackCameraResolution { get; set; } = VideoResolution.VGA_640x480; // Back camera resolution preset
public int BackCameraWidth { get; set; } = 0; // Custom back camera width (0 = use preset)
public int BackCameraHeight { get; set; } = 0; // Custom back camera height (0 = use preset)
public VideoResolution FrontCameraResolution { get; set; } = VideoResolution.VGA_640x480; // Front camera resolution preset
public int FrontCameraWidth { get; set; } = 0; // Custom front camera width (0 = use preset)
public int FrontCameraHeight { get; set; } = 0; // Custom front camera height (0 = use preset)
// Helper methods
public int GetBackCameraWidth() => BackCameraWidth > 0 ? BackCameraWidth : BackCameraResolution.GetWidth();
public int GetBackCameraHeight() => BackCameraHeight > 0 ? BackCameraHeight : BackCameraResolution.GetHeight();
public int GetFrontCameraWidth() => FrontCameraWidth > 0 ? FrontCameraWidth : FrontCameraResolution.GetWidth();
public int GetFrontCameraHeight() => FrontCameraHeight > 0 ? FrontCameraHeight : FrontCameraResolution.GetHeight();
// HTTPS configuration for MJPEG server
public bool UseHttps { get; set; } = false; // Enable HTTPS
public string? CertificatePath { get; set; } // SSL certificate path
public string? CertificatePassword { get; set; } // Certificate password
}
VideoProfile Class
Configuration for video encoding parameters.
public class VideoProfile
{
public string Name { get; set; } = ""; // Profile name (used in URL path)
public VideoResolution? Resolution { get; set; } // Resolution preset (auto-sets Width, Height, and bitrates)
public int Width { get; set; } = 640; // Video width (setting clears Resolution preset)
public int Height { get; set; } = 480; // Video height (setting clears Resolution preset)
public int MaxBitrate { get; set; } = 4000000; // Maximum bitrate (bps)
public int MinBitrate { get; set; } = 500000; // Minimum bitrate (bps)
public int Quality { get; set; } = 80; // JPEG quality (10-100)
// Helper methods
public int GetFrameBufferSize() => (Width * Height * 3) / 2; // YUV420 buffer size for H.264
public (int Width, int Height) GetDimensions() => (Width, Height);
}
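A quick usage sketch of VideoProfile (the values here are illustrative, not recommendations from the library):

```csharp
// Configure a low-bandwidth profile; Name is used in the URL path per the docs above.
var profile = new VideoProfile
{
    Name = "low",
    Width = 480,               // setting Width/Height clears any Resolution preset
    Height = 360,
    MinBitrate = 500_000,      // 0.5 Mbps
    MaxBitrate = 800_000,      // 0.8 Mbps
    Quality = 60               // JPEG quality for MJPEG clients
};

// YUV420 buffer: 480 * 360 * 3 / 2 = 259,200 bytes
Console.WriteLine(profile.GetFrameBufferSize()); // 259200
```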
VideoResolution Enum
Predefined video resolution presets for camera capture and H.264 encoding. Now supports up to 4K UHD!
public enum VideoResolution
{
QVGA_320x240, // 320x240 - Lowest quality, minimal bandwidth (~115 KB buffer)
Low_480x360, // 480x360 - Low quality (~259 KB buffer)
VGA_640x480, // 640x480 - Standard (default), most compatible (~460 KB buffer)
SVGA_800x600, // 800x600 - Enhanced standard (~720 KB buffer)
HD_1280x720, // 1280x720 - HD 720p (~1.38 MB buffer)
FullHD_1920x1080, // 1920x1080 - Full HD 1080p (~3.11 MB buffer)
QHD_2560x1440, // 2560x1440 - QHD/2K (~5.53 MB buffer)
UHD_3840x2160 // 3840x2160 - 4K UHD (~12.4 MB buffer)
}
// Extension methods
public static int GetWidth(this VideoResolution resolution);
public static int GetHeight(this VideoResolution resolution);
public static int GetRecommendedMinBitrate(this VideoResolution resolution);
public static int GetRecommendedMaxBitrate(this VideoResolution resolution);
public static (int Width, int Height) GetDimensions(this VideoResolution resolution);
public static int GetFrameBufferSize(this VideoResolution resolution);
public static string GetDisplayName(this VideoResolution resolution);
Resolution and H.264 Relationship:

| Resolution | Frame Buffer | Min Bitrate | Max Bitrate |
|------------|-------------|-------------|-------------|
| QVGA (320x240) | ~115 KB | 300 Kbps | 500 Kbps |
| Low (480x360) | ~259 KB | 500 Kbps | 800 Kbps |
| VGA (640x480) | ~460 KB | 800 Kbps | 1.5 Mbps |
| SVGA (800x600) | ~720 KB | 1 Mbps | 2 Mbps |
| HD (1280x720) | ~1.38 MB | 2 Mbps | 4 Mbps |
| Full HD (1920x1080) | ~3.11 MB | 4 Mbps | 8 Mbps |
| QHD/2K (2560x1440) | ~5.53 MB | 8 Mbps | 16 Mbps |
| 4K UHD (3840x2160) | ~12.4 MB | 15 Mbps | 30 Mbps |
Note: High resolutions (QHD, 4K) require devices with hardware encoder support. The library automatically validates encoder capabilities and falls back to the nearest supported resolution if the requested resolution is not available.
Methods
// Start the RTSP server
public bool Start()
// Stop the server and cleanup resources
public void Stop()
// Add a new user at runtime *NO REBOOT REQUIRED
public bool AddUser(string username, string password)
// Remove a user at runtime *NO REBOOT REQUIRED
public bool RemoveUser(string username)
// Update a user's password at runtime *NO REBOOT REQUIRED
public bool UpdateUser(string username, string password)
// Set back camera resolution using preset (call before Start() for best results)
public void SetBackCameraResolution(VideoResolution resolution)
// Set back camera resolution using custom dimensions
public void SetBackCameraResolution(int width, int height)
// Set front camera resolution using preset (call before Start() for best results)
public void SetFrontCameraResolution(VideoResolution resolution)
// Set front camera resolution using custom dimensions
public void SetFrontCameraResolution(int width, int height)
// Get current back camera resolution
public (int Width, int Height) GetBackCameraResolution()
// Get current front camera resolution
public (int Width, int Height) GetFrontCameraResolution()
// Static method to encode YUV data to JPEG
public static byte[] EncodeToJpeg(byte[] rawImageData, int width, int height, Android.Graphics.ImageFormatType format)
Events
// Fired when streaming state changes
public static event EventHandler<bool>? OnStreaming;
// Fired when client list changes
public static event Action<List<Client>>? OnClientsChange;
// Fired when new frame is available from back camera (for general purpose use: snapshots, processing, etc.)
public static event EventHandler<FrameEventArgs>? OnNewBackFrame;
// Fired when new frame is available from front camera (for general purpose use: snapshots, processing, etc.)
public static event EventHandler<FrameEventArgs>? OnNewFrontFrame;
MjpegServer Class
HTTP server for MJPEG streaming, perfect for web browser integration.
Constructor
public MjpegServer(
int port = 8089, // HTTP server port
int quality = 30, // JPEG compression quality (1-100)
string bindAddress = "*", // Bind address ("*" for all interfaces)
bool authEnabled = false, // Enable Basic HTTP authentication
Dictionary<string, string>? users = null, // Authentication users
bool useHttps = false, // Enable HTTPS
string? certificatePath = null, // Path to SSL certificate
string? certificatePassword = null, // Certificate password
int maxFrameRate = 30 // Maximum FPS per client (frame rate limiting)
)
Methods
// Start the MJPEG HTTP server
public void Start(bool StartWithoutStream = false)
// Stop the server
public void Stop()
// Get latest back camera JPEG frame (for snapshot endpoints)
public byte[]? GetLatestBackFrame()
// Get latest front camera JPEG frame (for snapshot endpoints)
public byte[]? GetLatestFrontFrame()
Properties
// Connected client counts
public int ClientCount { get; } // Total connected clients
public int BackClientCount { get; } // Back camera clients
public int FrontClientCount { get; } // Front camera clients
// Real-time FPS tracking
public double BackCameraFps { get; } // Current back camera FPS
public double FrontCameraFps { get; } // Current front camera FPS
// Total frame counters
public long TotalBackFrames { get; } // Total back camera frames processed
public long TotalFrontFrames { get; } // Total front camera frames processed
Endpoints
- http://your-ip:port/Back/ - Back camera stream
- http://your-ip:port/Front/ - Front camera stream
Camera Services
Low-level camera access services for frame capture.
BackCameraService / FrontCameraService
// Start camera capture
public void StartCapture(int width = 640, int height = 480)
// Stop camera capture
public void StopCapture()
// Event fired when new frame is available
public event EventHandler<FrameEventArgs>? FrameReceived;
// Event fired when error occurs
public event EventHandler<string>? ErrorOccurred;
EventBus System
Decoupled communication system for controlling services.
// Available commands
public enum BussCommand
{
START_CAMERA_FRONT,
STOP_CAMERA_FRONT,
START_CAMERA_BACK,
STOP_CAMERA_BACK,
START_MJPEG_SERVER,
STOP_MJPEG_SERVER,
SWITCH_CAMERA // Defined, but no behavior is attached to it yet
}
// Send command
EventBuss.SendCommand(BussCommand.START_CAMERA_BACK);
// Subscribe to commands
EventBuss.Command += (command) => {
// Handle command
};
Advanced Configuration
Authentication (Required)
Authentication is mandatory for the RTSP server. You must provide user credentials when initializing the server.
// Required: Provide authentication users
var users = new Dictionary<string, string>
{
{ "admin", "secure_password" },
{ "viewer", "readonly_pass" },
{ "mobile", "mobile123" }
};
var server = new Server(
Port: 7778,
Users: users // This parameter is required
);
Note: The server uses Digest authentication by default for security. Basic authentication is also supported for compatibility.
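For reference, the response value a client sends back during Digest authentication is computed per RFC 2617. A minimal sketch (MD5, no qop; illustrative only, not the library's internal implementation):

```csharp
// RFC 2617 Digest computation: response = MD5(HA1:nonce:HA2)
// where HA1 = MD5(user:realm:pass) and HA2 = MD5(method:uri).
using System.Security.Cryptography;
using System.Text;

static string Md5Hex(string s) =>
    Convert.ToHexString(MD5.HashData(Encoding.UTF8.GetBytes(s))).ToLowerInvariant();

static string DigestResponse(string user, string realm, string pass,
                             string method, string uri, string nonce)
{
    string ha1 = Md5Hex($"{user}:{realm}:{pass}");   // identity hash
    string ha2 = Md5Hex($"{method}:{uri}");          // method/URI hash
    return Md5Hex($"{ha1}:{nonce}:{ha2}");           // value sent in Authorization header
}
```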
H.264 Encoder Configuration
The H.264 encoder automatically optimizes for MediaTek and other Android devices:
// The encoder is automatically configured when streaming starts
// Default settings:
// - Bitrate: 2,000,000 bps (2 Mbps)
// - Frame rate: 25 FPS
// - Profile: Baseline
// - Keyframe interval: 1 second (IDR every ~25 frames)
// - I-frame interval: SetInteger (not SetFloat; critical for MediaTek compatibility)
// - RTP timestamps: Wall-clock based (Stopwatch) for cross-SoC reliability
// Dynamic bitrate adjustment happens automatically based on network conditions
MediaTek Device Notes:
- The encoder uses SetInteger(KeyIFrameInterval, 1) instead of SetFloat(). MediaTek MT6768 (and possibly other MediaTek SoCs) misinterprets sub-second float values as 0, causing every frame to become an IDR keyframe. This exhausts the encoder's internal buffers after ~1000 frames and causes a permanent stall.
- RTP timestamps are derived from Stopwatch wall-clock time instead of the encoder's PresentationTimeUs. The MT6768 reports PresentationTimeUs in units ~1000x larger than microseconds, which would cause RTP timestamp deltas of ~3,000,000 per frame instead of the expected ~3,600 (at 25fps/90kHz). Players would buffer forever waiting for "future" frames.
- The encoding loop drains output buffers before feeding new input to prevent buffer starvation on resource-constrained SoCs.
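The wall-clock timestamp approach can be sketched as follows (an illustration of the technique, not the library's exact code):

```csharp
// Deriving 90 kHz RTP timestamps from wall-clock time instead of
// MediaCodec's PresentationTimeUs (sketch of the approach described above).
using System.Diagnostics;

var clock = Stopwatch.StartNew();
uint rtpBase = 12345;  // random initial timestamp, as the RTP spec recommends

uint NextRtpTimestamp()
{
    // 90,000 ticks per second is the standard RTP clock rate for video.
    long ticks90kHz = clock.ElapsedMilliseconds * 90;
    return unchecked(rtpBase + (uint)ticks90kHz);
}
// At 25 fps, successive frames are ~40 ms apart, so successive timestamps
// differ by ~40 * 90 = ~3,600 ticks, matching the expected delta above.
```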
Video Resolution Configuration
Resolution directly affects H.264 encoder performance and bandwidth requirements. Higher resolutions need more processing power and network bandwidth.
Using Resolution Presets (Recommended)
using BaluMediaServer.Models;
// Option 1: Configure via ServerConfiguration
var config = new ServerConfiguration
{
Port = 7778,
BackCameraResolution = VideoResolution.HD_1280x720, // HD for back camera
FrontCameraResolution = VideoResolution.VGA_640x480, // Standard for front camera
Users = new Dictionary<string, string> { { "admin", "password" } }
};
var server = new Server(config);
// Option 2: Configure via constructor parameters
var server = new Server(
Port: 7778,
BackCameraResolution: VideoResolution.HD_1280x720,
FrontCameraResolution: VideoResolution.VGA_640x480,
Users: new Dictionary<string, string> { { "admin", "password" } }
);
// Option 3: Configure at runtime (before Start())
var server = new Server(Port: 7778, Users: users);
server.SetBackCameraResolution(VideoResolution.HD_1280x720);
server.SetFrontCameraResolution(VideoResolution.VGA_640x480);
server.Start();
Using Custom Resolutions
// Via ServerConfiguration
var config = new ServerConfiguration
{
BackCameraWidth = 800,
BackCameraHeight = 600,
FrontCameraWidth = 640,
FrontCameraHeight = 480
};
// Via Server methods
server.SetBackCameraResolution(800, 600);
server.SetFrontCameraResolution(640, 480);
// Direct camera service (advanced usage)
var backCamera = new BackCameraService();
backCamera.StartCapture(1280, 720);
Resolution Selection Guidelines
| Use Case | Recommended Resolution | Notes |
|---|---|---|
| Low bandwidth / Mobile data | QVGA (320x240) or Low (480x360) | Minimal data usage |
| Standard streaming | VGA (640x480) | Default, most compatible |
| High quality local network | HD (1280x720) | Good balance |
| Professional quality | Full HD (1920x1080) | Requires powerful device |
| Ultra-high quality | QHD/2K (2560x1440) | High-end devices only |
| Maximum quality | 4K UHD (3840x2160) | Flagship devices, high bandwidth required |
Important: The H.264 encoder buffer size is calculated as (width * height * 3) / 2 for YUV420 format. Higher resolutions significantly increase memory usage. The library uses dynamic buffer management to prevent OOM crashes at high resolutions.
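The formula can be checked against the resolution table above:

```csharp
// YUV420 frame buffer: one full-resolution Y plane plus quarter-size
// U and V planes => width * height * 3 / 2 bytes.
static int FrameBufferSize(int width, int height) => (width * height * 3) / 2;

Console.WriteLine(FrameBufferSize(640, 480));    // 460800  (~460 KB, VGA row)
Console.WriteLine(FrameBufferSize(1920, 1080));  // 3110400 (~3.11 MB, Full HD row)
```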
Frame Capture for Snapshots and Processing
The server provides general-purpose frame events that can be used for snapshots, image processing, or custom applications. These events work independently of the streaming functionality.
// Subscribe to frame events for custom processing
Server.OnNewBackFrame += async (sender, frameArgs) => {
// frameArgs.Data contains raw YUV data
// frameArgs.Width, frameArgs.Height contain dimensions
// frameArgs.Timestamp contains capture timestamp
// Convert to JPEG for snapshot
var jpegData = Server.EncodeToJpeg(
frameArgs.Data,
frameArgs.Width,
frameArgs.Height,
Android.Graphics.ImageFormatType.Nv21
);
// Save snapshot
await SaveSnapshotAsync(jpegData);
// Or perform custom image processing
ProcessFrame(frameArgs.Data, frameArgs.Width, frameArgs.Height);
};
Server.OnNewFrontFrame += (sender, frameArgs) => {
// Same processing for front camera frames
HandleFrontCameraFrame(frameArgs);
};
// Manually trigger camera capture for snapshots using EventBus
private async Task TakeSnapshot()
{
// Start camera temporarily if not already running
EventBuss.SendCommand(BussCommand.START_CAMERA_BACK);
// Wait for frame capture
await Task.Delay(500);
// Stop camera if not needed for streaming
if (!IsCurrentlyStreaming())
{
EventBuss.SendCommand(BussCommand.STOP_CAMERA_BACK);
}
}
// Example: Automatic snapshot every 30 seconds
private async void StartPeriodicSnapshots()
{
while (true)
{
await TakeSnapshot();
await Task.Delay(TimeSpan.FromSeconds(30));
}
}
Network Usage
RTSP URLs
- Back Camera (H.264): rtsp://your-ip:7778/live/back
- Front Camera (H.264): rtsp://your-ip:7778/live/front
- Back Camera (MJPEG): rtsp://your-ip:7778/live/back/mjpeg
- Front Camera (MJPEG): rtsp://your-ip:7778/live/front/mjpeg
HTTP MJPEG URLs
- Back Camera: http://your-ip:8089/Back/
- Front Camera: http://your-ip:8089/Front/
Connecting with Popular Clients
VLC Media Player
- Open VLC
- Go to Media > Open Network Stream
- Enter: rtsp://admin:password123@your-ip:7778/live/back
- Click Play
FFmpeg
# View stream
ffmpeg -i rtsp://admin:password123@your-ip:7778/live/back -f sdl output
# Record stream
ffmpeg -i rtsp://admin:password123@your-ip:7778/live/back -c copy output.mp4
# Re-stream to another server
ffmpeg -i rtsp://admin:password123@your-ip:7778/live/back -c copy -f rtsp rtsp://other-server/stream
OBS Studio
- Add Source > Media Source
- Uncheck "Local File"
- Input: rtsp://admin:password123@your-ip:7778/live/back
- Click OK
Web Browser (MJPEG only)
<img src="http://your-ip:8089/Back/" alt="Live Stream" />
Troubleshooting
Common Issues
Camera Permission Denied
// Always request permissions before starting
var status = await Permissions.RequestAsync<Permissions.Camera>();
if (status != PermissionStatus.Granted)
{
// Handle permission denied
await DisplayAlert("Error", "Camera permission is required", "OK");
return;
}
Port Already in Use
try
{
var server = new Server(Port: 7778);
bool started = server.Start();
if (!started)
{
// Try alternative port
server = new Server(Port: 7779);
started = server.Start();
}
}
catch (Exception ex)
{
Console.WriteLine($"Server start failed: {ex.Message}");
}
Network Connectivity Issues
// Check network connectivity
var networkAccess = Connectivity.Current.NetworkAccess;
if (networkAccess != NetworkAccess.Internet)
{
await DisplayAlert("Error", "No network connection", "OK");
return;
}
// Get local IP address for clients to connect
var localIP = server.GetLocalIpAddress();
Console.WriteLine($"Connect to: rtsp://{localIP}:7778/live/back");
H.264 Encoding Issues
// Check if device supports hardware encoding
try
{
var encoder = new MediaTekH264Encoder(640, 480);
bool started = encoder.Start();
if (!started)
{
// Fallback to MJPEG
Console.WriteLine("H.264 not supported, using MJPEG");
}
}
catch (Exception ex)
{
Console.WriteLine($"H.264 encoder error: {ex.Message}");
}
H.264 Stream Freezes After a Few Seconds
If the H.264 stream starts but freezes after a few seconds (while MJPEG continues working), check these common causes:
1. All-IDR Output (MediaTek devices)
- Symptom: Every encoded frame is a keyframe (IDR), encoder stalls after ~1000 frames
- Cause: Using SetFloat(KeyIFrameInterval, value) with sub-second values; MediaTek interprets this as 0 (every-frame IDR)
- Fix: Always use SetInteger(KeyIFrameInterval, N) where N >= 1. The library handles this automatically since v1.5.16
- Diagnostic: Check encoder output logs for key=True on every frame
2. RTP Timestamp Mismatch
- Symptom: Stream appears frozen in VLC/ffplay, but encoder is producing frames
- Cause: MediaCodec PresentationTimeUs may not be in microseconds on all SoCs (the MT6768 reports values ~1000x larger)
- Fix: The library derives RTP timestamps from Stopwatch wall-clock time since v1.5.16, which works regardless of encoder timestamp units
- Diagnostic: Check RTP timestamp deltas: at 25fps/90kHz they should be ~3,600 per frame. If they are ~3,000,000, the encoder timestamps are in the wrong units
3. Encoder Thread Safety
- Symptom: Stream freezes after ~2 frames
- Cause: Concurrent JNI calls to MediaCodec from different threads
- Fix: All MediaCodec access (input and output) must be serialized on a single thread. Fixed since v1.5.15
4. Client Lifecycle Issues
- Symptom: Stream works briefly then stops, client appears disconnected
- Cause: Aggressive timeout settings or premature client cleanup
- Fix: The library uses graduated error counting (10 consecutive failures for TCP, 5 for UDP) and checks IsPlaying before marking clients as dead. Fixed since v1.5.16
Performance Optimization
Memory Management
// Properly dispose of resources
protected override void OnDisappearing()
{
_rtspServer?.Stop(); // Stops all services
_mjpegServer?.Stop(); // Stops HTTP server
_frontCamera?.StopCapture(); // Stops camera capture
_backCamera?.StopCapture(); // Stops camera capture
base.OnDisappearing();
}
Reduce Latency & Improve Reliability
// TCP transport is recommended over UDP for reliability
// The client (VLC, FFmpeg, etc.) will automatically negotiate transport
// To force TCP in VLC: go to Tools > Preferences > Input/Codecs > Network > RTP over RTSP (TCP)
// For FFmpeg, use TCP explicitly:
// ffmpeg -rtsp_transport tcp -i rtsp://admin:password@your-ip:7778/live/back output.mp4
// Reduce frame rate for better performance
frontCamera.StartCapture(640, 480); // Lower resolution = better performance
// Monitor client count and adjust quality
Server.OnClientsChange += (clients) => {
if (clients.Count > 5)
{
// Reduce quality for multiple clients
EventBuss.SendCommand(BussCommand.STOP_CAMERA_FRONT);
}
};
Connection Stability Notes:
- The server uses graduated error counting: TCP clients tolerate up to 10 consecutive send failures before being disconnected, UDP clients tolerate 5. This prevents premature disconnection from transient network issues.
- Playing clients are protected from the WatchDog: they are never marked as dead while actively streaming.
- Frame dequeue uses a 2-second timeout to prevent the streaming loop from blocking forever if the H.264 encoder stalls.
- A per-client `SemaphoreSlim` (`SendLock`) serializes all sends to prevent TCP interleaved framing corruption.
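A minimal sketch of the per-client send serialization described above (assumed shape; the library's actual client type may differ). With RTSP interleaved TCP, two overlapping writes would corrupt the `$`-channel framing, so every send must hold the client's lock:

```csharp
using System.Net.Sockets;
using System.Threading;
using System.Threading.Tasks;

class StreamClient
{
    private readonly Socket _socket;

    // One permit: only a single send may be in flight per client.
    public SemaphoreSlim SendLock { get; } = new(1, 1);

    public StreamClient(Socket socket) => _socket = socket;

    public async Task SendAsync(byte[] packet, CancellationToken ct)
    {
        await SendLock.WaitAsync(ct).ConfigureAwait(false);
        try
        {
            // Whole RTP packet (with interleaved header) written atomically
            // with respect to other sends to this client.
            await _socket.SendAsync(packet, SocketFlags.None, ct).ConfigureAwait(false);
        }
        finally
        {
            SendLock.Release();
        }
    }
}
```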
Transport Protocol Recommendations
// Current known issue: UDP transport may fail when switching between transport modes
// Workaround: Use TCP transport which is more reliable
// In VLC Media Player:
// 1. Go to Tools > Preferences
// 2. Show settings: All
// 3. Navigate to Input / Codecs > Network
// 4. Set "RTP over RTSP (TCP)" to "Always"
// In FFmpeg:
// Use the -rtsp_transport tcp flag
Images
Streaming back camera (ffplay)
Streaming front camera (ffplay)
Mobile App Interface
🧪 Testing
The project includes a comprehensive unit test suite using xUnit and FluentAssertions.
Running Tests
# Navigate to test project
cd BaluMediaServer.Tests
# Run all tests
dotnet test
# Run tests with detailed output
dotnet test --verbosity normal
# Run with code coverage
dotnet test --collect:"XPlat Code Coverage"
# Run specific test class
dotnet test --filter "FullyQualifiedName~VideoProfileTests"
Test Coverage
| Component | Tests | Coverage |
|---|---|---|
| VideoProfile | 22 | Quality clamping, name sanitization, defaults |
| ServerConfiguration | 27 | All property defaults, HTTPS config |
| EventBuss | 7 | Command propagation, subscriptions |
| RtspRequest | 21 | CSeq parsing, headers, properties |
| Enums | 57 | AuthType, CodecType, BussCommand validation |
| Total | 134 | ~90% of testable code |
Test Structure
BaluMediaServer.Tests/
├── Unit/
│   ├── Models/
│   │   ├── VideoProfileTests.cs
│   │   ├── ServerConfigurationTests.cs
│   │   └── RtspRequestTests.cs
│   ├── Infrastructure/
│   │   └── EventBussTests.cs
│   └── Enums/
│       └── EnumTests.cs
└── BaluMediaServer.Tests.csproj
Note on Android-Dependent Code
Unit tests cover pure C# components. Android-dependent classes (camera services, encoders, network servers) require integration testing on an Android device/emulator.
⚠️ Current Limitations
- Platform Support: Only Android 8.0+ (API 26+) is currently supported
- Framework Support: Tested with .NET 9.0 (partial support on .NET 8.0)
- iOS Support: Not available (long-term roadmap item)
- Image orientation: Some devices may experience image rotation issues
- Blazor preview: Some devices may experience issues displaying images from the MJPEG Server, depending on LoopBack access restrictions
🛣️ Roadmap
Completed (v1.1-v1.5.16)
- ✅ Fix H.264 stream stutter issues
- ✅ Add support for multiple profiles/routes (`/live/front`, `/live/back`)
- ✅ Add user/password control panel
- ✅ Add bitrate/resolution configuration
- ✅ Fix UDP transport reliability issues
- ✅ Reduce streaming latency (target: <100ms)
- ✅ Add comprehensive code documentation
- ✅ NuGet package distribution
- ✅ External network access for MJPEG server
- ✅ HTTPS support for MJPEG server
- ✅ Basic authentication for MJPEG server
- ✅ Unit testing infrastructure with xUnit
- ✅ 4K UHD and QHD resolution support
- ✅ High resolution OOM crash fix
- ✅ Automatic encoder resolution validation and fallback
- ✅ Dynamic memory-optimized buffer management
- ✅ Modular RTSP server architecture refactoring
- ✅ Connection stability and timeout improvements (v1.5.9)
- ✅ Event-driven architecture and shared JPEG encoding (v1.5.10)
- ✅ Continuous streaming mode (v1.5.11)
- ✅ Native library frame delivery fix (v1.5.12)
- ✅ MJPEG streaming smoothness with per-client frame pacing (v1.5.13)
- ✅ Client reconnection bug fix (v1.5.14)
- ✅ H.264 thread safety fix and connection stability (v1.5.15)
- ✅ H.264 stream freeze fix: MediaTek I-frame interval, RTP timestamps, client lifecycle (v1.5.16)
Planned (v1.6+)
- ⬜ Fix image rotation on some devices
- ⬜ Add H.265 (HEVC) codec support
- ⬜ Add integration tests for Android-dependent code
Long Term (v2.0+)
- ⬜ iOS support via .NET MAUI
- ⬜ Audio streaming support
- ⬜ WebRTC integration
- ⬜ Cloud streaming integration
- ⬜ Advanced analytics and monitoring
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
MIT License Summary:
- ✅ Commercial use
- ✅ Modification
- ✅ Distribution
- ✅ Private use
- ❌ Liability
- ❌ Warranty
💡 Why This Matters
There are few (if any) options to integrate RTSP servers with Android using C# and MAUI. This project bridges that gap. While it's still in early stages, it already provides a clean way to stream camera feeds using modern C# toolingβand it's open for everyone to contribute, extend, or just use freely.
🙏 Acknowledgments
- MediaTek for excellent hardware encoding support
- .NET MAUI Team for the amazing cross-platform framework
- Android Camera2 API for providing low-level camera access
- RTSP/RTP Specifications for the streaming protocols
- Open Source Community for inspiration and support
💬 Support & Contact
- Issues: Please use GitHub Issues for bug reports and feature requests
- Discussions: Use GitHub Discussions for general questions and ideas
- Email: danielulrichtamayo@gmail.com
Getting Help
- Check the Documentation: This README covers most use cases
- Search Issues: Your question might already be answered
- Create an Issue: Provide detailed information about your problem
- Community: Join discussions with other developers
Patch Notes
v1.1.2: Added two new parameters to the Server constructor to control whether the front and back cameras should be enabled. This avoids the problem of only one camera starting on devices that cannot handle both cameras at the same time.
v1.1.3: Added auto-quality adjustment based on RTCP feedback for the MJPEG codec, increasing or decreasing image quality to keep the video stable. Also added a preview (WIP) of video profiles, which allow custom paths per profile and will allow setting a custom resolution, bitrate, and more.
v1.1.4: Added an auth option to the Server class constructor to enable or disable authentication on the RTSP stream, and added a feature to set the video quality in the MJPEG server.
v1.1.5: Fixed the EventBuss command handling in the Server class: if the server was already started, the flag is no longer raised again, which previously could crash the app with "Port already in use" or cause excessive CPU usage across multiple MJPEG servers. Added an EventBuss preview to MjpegServer so commands can be handled there, though it still needs synchronization with the main server to avoid duplicate instances or commands.
v1.1.6: Added ArrayPool to avoid GC overhead from repeated byte[] allocations such as RTP packets. Added .ConfigureAwait(false) on awaited methods to avoid synchronization-context overhead (theoretically from ~100 ms down to ~100 µs), improving performance on devices with fewer CPU resources.
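The ArrayPool pattern mentioned in v1.1.6 looks roughly like this (an illustrative sketch, not the library's exact code):

```csharp
// Rent a buffer for an RTP packet from the shared pool instead of
// allocating a new byte[] per packet, then return it after the send.
using System.Buffers;

byte[] buffer = ArrayPool<byte>.Shared.Rent(1500);   // MTU-sized RTP packet
try
{
    // ... build the RTP header + payload into buffer ...
    // await socket.SendAsync(buffer.AsMemory(0, packetLength), ...);
}
finally
{
    // Returning the buffer makes it reusable, so the GC never sees
    // the per-packet allocations that caused the overhead.
    ArrayPool<byte>.Shared.Return(buffer);
}
```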
v1.1.7: Fixed MJPEG codec bugs to avoid crashes, fixed the WatchDog prematurely closing some connections, and fixed several preview issues.
v1.1.8: Fixed issues in the camera services where closing a camera or service prevented restarting it.
v1.1.9: Added user/password handling options.
v1.1.10: Added a custom ServerConfiguration class to handle all server configuration more easily.
v1.1.11: Fixed the Server class to accept the configuration class, fixed MjpegServer disposal in the Server class, fixed MjpegServer to a fixed 30 fps rate, and fixed CPU leaks.
v1.2.0: Added a new global encoder for compatibility with multiple devices (not only MediaTek) and fixed several client-handling features in the server.
v1.3.1: Major H.264 Stability and Stutter Fix. This release targets and resolves a series of core issues in the H.264 streaming logic that caused stutter, frame overlapping, and "two-frame" freezes. The stream is now significantly smoother and more stable.
Fixed Critical Timestamp Conversion:
Problem: The server was incorrectly converting the camera encoder's timestamps. We discovered the encoder provides timestamps in nanoseconds, but the server was treating them as microseconds. This resulted in RTP timestamps being 1000x too large, causing players to think a single frame should last for 30+ seconds, leading to a "two-frame" freeze.
Fix: The timestamp conversion logic in EncoderTimestampToRtp has been corrected to divide by 1,000,000,000.0 (nanoseconds) instead of 1,000,000.0 (microseconds).
Corrected RTP Marker Bit Logic:
Problem: The RTP "Marker Bit" (M-bit), which signals the end of a video frame, was being set incorrectly (e.g., on every small NAL unit). This confused decoders, causing them to render frames on top of each other or get stuck.
Fix: The server now correctly tracks all NAL units and fragments belonging to a single frame. The M-bit is now set only on the absolute last RTP packet of the last NAL unit for that frame, as required by the H.264 spec.
Removed Conflicting Stream Pacing:
Problem: The streaming loop had two "pacemakers" fighting each other:
A fixed Task.Delay trying to send at 45 FPS (22ms).
The H.264 encoder, which was producing frames at 25 FPS (40ms).
Fix: The fixed Task.Delay has been removed for H.264 streaming. The loop is now event-driven: it sends a frame as soon as the encoder provides one and loops immediately. If no new frame is ready, it waits a tiny 10ms (to prevent 100% CPU usage) and checks again. This lets the encoder, not the server loop, dictate the stream's framerate.
Eliminated Network Send Latency (Nagle's Algorithm):
Problem: For TCP streams, the OS was likely bundling small RTP packets together before sending them (Nagle's Algorithm). This is good for file transfers but terrible for real-time video, as it introduces small, random delays perceived as micro-stutter.
Fix: Nagle's Algorithm is now explicitly disabled (NoDelay = true) on all accepted client sockets, ensuring every RTP packet is sent to the network immediately.
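Disabling Nagle's algorithm on an accepted client socket is a one-liner. A minimal sketch (the port is the RTSP port used elsewhere in this README; the surrounding accept loop is illustrative):

```csharp
using System.Net;
using System.Net.Sockets;

var listener = new TcpListener(IPAddress.Any, 7778);
listener.Start();

TcpClient client = await listener.AcceptTcpClientAsync();
client.NoDelay = true;   // disable Nagle: every small RTP packet is sent immediately
```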
Removed H.264 Frame Lock Contention:
Problem: The encoder thread (writing a new frame) and the network thread (reading that frame) were using the same lock. This meant one thread often had to wait for the other, causing a "hiccup" in frame delivery.
Fix: This lock has been completely replaced with a high-performance, lock-free Interlocked.Exchange operation. This allows the encoder and network threads to swap frame data atomically without ever blocking each other, resulting in a smoother handoff from camera to network.
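The lock-free handoff described above can be sketched as follows (field and method names are illustrative, not the library's actual code):

```csharp
using System.Threading;

class FrameBuffer
{
    private byte[]? _latestFrame;

    // Encoder thread: publish the newest frame atomically,
    // discarding any unconsumed previous frame.
    public void Publish(byte[] frame) =>
        Interlocked.Exchange(ref _latestFrame, frame);

    // Network thread: atomically take the frame (or null if none is ready).
    // Neither thread ever blocks the other.
    public byte[]? Take() =>
        Interlocked.Exchange(ref _latestFrame, null);
}
```

Note that, as the v1.4.0 notes below describe, this take-and-null approach was later replaced with non-destructive reads to support multiple clients.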
v1.4.0: Major H.264 Codec Overhaul and VLC Compatibility Fix. This release addresses critical issues in the H.264 implementation that caused stutter, timing problems, and complete playback failure on VLC and other strict players.
Fixed Critical Timestamp Unit Mismatch:
Problem: The encoder outputs timestamps in microseconds (PresentationTimeUs), but the server was treating them as nanoseconds. This resulted in RTP timestamps being 1000x smaller than expected, causing massive stutter, frame overlap, and timing desynchronization.
Fix: The timestamp conversion in `EncoderTimestampToRtp` now correctly converts microseconds to RTP units using fixed-point arithmetic: `(deltaUs * 9 + 50) / 100` (equivalent to `deltaUs * 90000 / 1_000_000`).
Added VLC Compatibility (sprop-parameter-sets in SDP):
Problem: VLC and many strict players require `sprop-parameter-sets` in the SDP to initialize the H.264 decoder. Without this, VLC would fail to decode the stream entirely.
Fix: The SDP now dynamically includes base64-encoded SPS and PPS in the `a=fmtp` line when available. The server caches these parameter sets as they're received from the encoder.
Fixed NAL Unit Extraction (Multiple NALs per Frame):
Problem: The encoder was treating each output buffer as a single NAL unit, even when it contained multiple NAL units (e.g., SEI + IDR, or SPS + PPS combined). This caused incomplete frames and decoder confusion.
Fix: Added an `ExtractNalUnitsFromFrame` method that properly parses start codes and extracts all NAL units from encoder output. The server now sends each NAL unit as a separate RTP packet (or FU-A fragmented if large).
Fixed SPS/PPS Start Code Handling:
Problem: When extracting SPS/PPS from MediaFormat's csd-0/csd-1 buffers, the code assumed specific start code formats. Some encoders provide raw NAL data without start codes, others use 3-byte or 4-byte start codes.
Fix: Added robust `GetStartCodeLength`, `GetNalType`, and `EnsureStartCode` helper methods that handle all cases. Parameter sets are now normalized to 4-byte start codes for consistent handling.
Fixed Multi-Client Frame Delivery:
Problem: Using `Interlocked.Exchange` with null replacement caused frames to be consumed by one client, leaving other clients without frames.
Fix: Changed to non-destructive frame reading where each client tracks its own last-sent timestamp. Multiple clients can now receive the same frame, and per-client timestamp tracking prevents duplicate sends.
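The fixed-point timestamp conversion described in this release reduces to a small helper. A sketch (the real `EncoderTimestampToRtp` may differ in shape):

```csharp
// Convert an elapsed interval in microseconds to 90 kHz RTP units.
static uint MicrosecondsToRtp(long deltaUs)
{
    // deltaUs * 90000 / 1_000_000 reduces to deltaUs * 9 / 100;
    // the +50 rounds to the nearest RTP unit instead of truncating,
    // all in integer arithmetic with no floating point.
    return (uint)((deltaUs * 9 + 50) / 100);
}

// Example: a 40 ms frame interval (25 fps) is 40,000 us -> 3,600 RTP units,
// matching the expected per-frame delta at 25 fps on a 90 kHz clock.
```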
Improved Frame Dropping Strategy:
Problem: The encoder was aggressively dropping frames (keeping only 2 max), which could break B-frame prediction chains and cause visible stutter.
Fix: Frame queue limit increased to 3 with single-frame-at-a-time dropping. This provides better buffering while maintaining low latency.
Added RTCP Sender Reports:
Problem: VLC and other players use RTCP Sender Reports (SR) for clock synchronization and jitter buffer management. Without SR packets, players may exhibit poor sync and choppy playback.
Fix: The server now sends RTCP Sender Reports every 5 seconds. Each SR includes NTP timestamp, RTP timestamp, packet count, and octet count as per RFC 3550.
Enhanced Code Documentation:
Added XML documentation comments to all major methods explaining their purpose, parameters, and behavior.
Improved code readability with clear comments explaining RTP/RTCP protocol details.
v1.4.1: Client Connection Management Overhaul and H.264 Reliability Fix. This release addresses critical issues with abrupt client disconnection handling and encoder buffer size mismatches that caused delayed reconnections and missing video.
Fixed Critical Client Collection Bug (ConcurrentBag → ConcurrentDictionary):
Problem: The server used `ConcurrentBag<Client>` with `TryTake()` for client cleanup. `TryTake()` removes a random element, not the specific client being cleaned up. This corrupted the client list over time, causing ghost clients and preventing proper cleanup.
Fix: Replaced `ConcurrentBag<Client>` with `ConcurrentDictionary<string, Client>`. Client cleanup now uses `TryRemove(client.Id, out _)` to remove the exact client being disconnected.
Added TCP Connection Timeout Detection:
Problem: `Socket.Connected` does not detect abrupt disconnections (network failure, process kill). The server would continue trying to stream to dead clients, blocking resources and preventing new clients from receiving video.
Fix: Added comprehensive connection health tracking:
- `LastActivityTime` tracks the last successful send per client
- `ConsecutiveSendErrors` counts sequential failures
- A 5-second send timeout using `CancellationTokenSource` detects stuck connections
- A 30-second inactivity timeout in the streaming loop catches zombie connections
- Clients are marked as disconnected after 3 consecutive errors or a socket exception
Fixed H.264 Encoder Buffer Size Mismatch:
Problem: The encoder was initialized with camera-reported dimensions (e.g., 640x480), but some cameras send frames with different actual sizes (e.g., 640x640). This caused `Input buffer too small` errors and dropped frames.
Fix: Added a `CalculateDimensionsFromFrameSize()` method that:
- Checks common resolutions against the actual YUV420 frame size
- Calculates dimensions using width/height hints when possible
- Falls back to a square aspect ratio calculation
- Ensures the encoder is always initialized with correct frame dimensions
Improved UDP Error Handling:
Problem: UDP send errors weren't properly tracked, allowing broken connections to persist.
Fix: UDP sends now track activity time and consecutive errors. Clients are disconnected after 5 consecutive UDP errors or when network is unreachable.
Enhanced Error Logging:
Added detailed logging for connection timeouts, send failures, and dimension mismatches to aid debugging.
v1.5.0: Stability and Performance Improvements. This release focuses on connection reliability, faster disconnection detection, smoother video playback, and reduced latency.
Improved Socket Disconnection Detection:
Problem: The `Socket.Connected` property doesn't reliably detect abrupt TCP disconnections (client crash, network loss, etc.). The server would continue streaming to dead connections for extended periods.
Fix: Added an `IsSocketConnected()` method that uses `Socket.Poll()` to actively probe connection state. If poll returns readable but no data is available, the connection is confirmed closed. This detects dead connections within seconds instead of minutes.
Reduced Reconnection Time (WatchDog Optimization):
Problem: WatchDog ran every 60 seconds, causing long delays before the server detected all clients were gone and could reset state for new connections.
Fix: WatchDog interval reduced to 5 seconds. Also improved logic to check `IsPlaying` status alongside socket connection, actively clean up dead clients, and clear SPS/PPS caches when resetting streaming state.
Fixed Frame Queue for Smoother Playback:
Problem: H.264 frames were stored in single variables (`_latestH264FrameBack/Front`) using `Interlocked.Exchange`. If the encoder produced frames faster than they could be sent, frames would be overwritten and lost, causing stuttering.
Fix: Replaced the single frame variables with `ConcurrentQueue<H264FrameEventArgs>` (max 5 frames buffered). Frames are now queued and sent in order, preventing loss during brief processing delays.
Optimized Encoder Input/Output Handling:
Problem: Encoder used 0ms timeout for buffer operations, causing missed buffers and aggressive frame dropping (max 1 frame in queue).
Fix:
- Input buffer dequeue timeout increased to 10ms
- Output buffer dequeue timeout increased to 10ms
- Frame queue limit increased from 1 to 3 frames
- Re-enabled sleep in encoding loop to prevent CPU spinning
Reduced Streaming Latency:
Fix: Multiple latency optimizations applied:
- Socket buffers reduced from 256KB to 64KB (less buffering delay)
- TCP_NODELAY explicitly set on all connections
- I-frame interval reduced from 2s to 1s (faster stream recovery)
- Inactivity timeout reduced from 30s to 10s (faster dead connection cleanup)
Enhanced Connection Health Checks:
Fix: The streaming loop now checks three conditions before each frame:
- `IsSocketConnected()` for active connection state
- Inactivity timeout (10 seconds with no successful send)
- Consecutive send errors (disconnects after 3 failures)
Fixed Frame Stride Padding Handling:
Problem: Android cameras may include row stride padding in YUV frames, causing buffer size mismatches (e.g., 640x480 camera sending 614,398 bytes instead of expected 460,800).
Fix: Added frame size normalization via truncation to expected size. While this may cause minor artifacts on some devices, it prevents encoder crashes and ensures video delivery.
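The `Socket.Poll()`-based liveness probe described in this release can be sketched as follows (assumed shape, not the library's exact code):

```csharp
using System.Net.Sockets;

// A socket that polls readable but has zero bytes available has been
// closed by the peer; Socket.Connected alone would not report this.
static bool IsSocketConnected(Socket socket)
{
    try
    {
        bool readable = socket.Poll(1000 /* microseconds */, SelectMode.SelectRead);
        return !(readable && socket.Available == 0);
    }
    catch (SocketException)
    {
        return false;   // probing a dead socket throws: treat as disconnected
    }
}
```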
v1.5.1: MJPEG Server External Access and Improvements. This release enables external device access to the MJPEG stream, making it usable from any device on the network via a simple
`<img>` tag.
Enabled External Network Access:
Problem: The MJPEG server was hardcoded to bind to `127.0.0.1` (localhost only), making it impossible for external devices to access the stream.
Fix: Changed the default binding to `0.0.0.0` (all interfaces). Added a configurable `bindAddress` parameter and wildcard prefix support for broader compatibility.
Added Optional Basic Authentication:
Problem: When exposed to the network, the MJPEG stream had no authentication, creating a security risk.
Fix: Added optional Basic HTTP authentication. When `AuthRequired` is enabled, clients must provide valid credentials. Authentication uses the same user database as the RTSP server.
Added CORS Headers for Web Integration:
Fix: MJPEG responses now include proper CORS headers (`Access-Control-Allow-Origin: *`) and cache control headers, enabling seamless integration with web pages on any domain.
Fixed Front Camera Client Handling:
Problem: `WriteDataAsync` only checked the `_clientsBack` dictionary for client ID lookup, causing Front camera clients to not receive frames properly.
Fix: Added an `isBackCamera` parameter to correctly handle both Front and Back camera clients with proper dictionary lookups.
Added Per-Client Timeout with Slow Client Protection:
Problem: One slow client could block frame delivery to all other clients due to `Task.WhenAll()` waiting for everyone.
Fix: Added a `WriteDataAsyncWithTimeout()` wrapper with a 2-second timeout per client. Slow clients are automatically disconnected without affecting others.
Fixed Memory Leak in Client Tracking:
Problem: The `_clientLastFrameTime` dictionary was never cleaned up when clients disconnected, causing memory accumulation.
Fix: Client IDs are now properly removed from `_clientLastFrameTime` in all cleanup paths (normal disconnect, timeout, error).
Added Client Count Properties:
Fix: Added `ClientCount`, `BackClientCount`, and `FrontClientCount` properties for monitoring connected MJPEG clients.
Added MjpegServerPort Configuration:
Fix: The MJPEG server port is now configurable via `ServerConfiguration.MjpegServerPort` (default: 8089) or a constructor parameter.
Usage Example (External Access):
<img src="http://192.168.1.100:8089/Back/" alt="Live Stream" />
<img src="http://admin:password123@192.168.1.100:8089/Back/" alt="Live Stream" />
v1.5.2: Comprehensive Code Documentation. Added XML documentation comments to all public classes, methods, properties, and events.
Models:
`FrameEventArgs`, `H264FrameEventArgs`, `Client`, `VideoProfile`, `ServerConfiguration`, `RtspRequest`, `RtspAuth`, `EncoderInfo`, `AuthType`, `CodecType`, `BussCommand`, `TransportMode`
Services:
`Server`, `MjpegServer`, `FrontCameraService`, `BackCameraService`
Encoders:
`H264Encoder` (general-purpose), `MediaTekH264Encoder` (MediaTek-optimized)
Utilities:
`EventBuss`, `FrameConverterHelper`, `FrameCallback`, `ICameraService`
Benefits:
- Full IntelliSense support in Visual Studio and VS Code
- Auto-generated API documentation capability
- Improved code maintainability and developer experience
v1.5.3: Unit Testing Infrastructure. Added comprehensive unit test suite using xUnit and FluentAssertions.
Test Project:
`BaluMediaServer.Tests` targeting `net9.0` with a file-linking approach for cross-targeting compatibility
Test Coverage (134 tests):
- `VideoProfileTests` (22 tests): Quality clamping, name sanitization, default values
- `ServerConfigurationTests` (27 tests): All property defaults, HTTPS config, camera settings
- `EventBussTests` (7 tests): Command propagation, multiple subscribers, unsubscribe handling
- `RtspRequestTests` (21 tests): CSeq parsing, header handling, property initialization
- `EnumTests` (57 tests): AuthType, CodecType, BussCommand value validation and parsing
Tools: xUnit, FluentAssertions, Coverlet for code coverage
Run Tests:
cd BaluMediaServer.Tests
dotnet test
dotnet test --collect:"XPlat Code Coverage"
v1.5.4: Critical Bug Fixes for Server Startup and MJPEG Streaming. This release addresses multiple issues that could cause the server to fail silently or MJPEG streaming to not transmit video.
Fixed Wrong Event Unsubscription in Server.cs:
Problem: When stopping the back camera via `BussCommand.STOP_CAMERA_BACK`, the code was incorrectly unsubscribing from `OnFrontFrameAvailable` instead of `OnBackFrameAvailable`. This caused event handler leaks and potential memory issues.
Fix: Corrected the event unsubscription to use `OnBackFrameAvailable` for the back camera service.
Fixed MJPEG Server Initialization Issues:
Problem: The `MjpegServer` was being created with default parameters at field initialization, causing:
- A memory leak from orphaned event subscriptions (the initial instance subscribed to static events but was replaced in the constructor)
- Lost configuration when the MJPEG server was recreated (only the `quality` parameter was passed, losing port, authentication, HTTPS settings, etc.)
Fix:
- Added stored fields for all MJPEG server settings (`_mjpegServerPort`, `_mjpegUseHttps`, `_mjpegCertificatePath`, `_mjpegCertificatePassword`)
- Created a `CreateMjpegServer()` helper method that uses all stored configuration
- Changed `_mjpegServer` to nullable with proper null checks throughout
- All MJPEG server recreation points now use the helper method with full configuration
Fixed Channel Initialization Race Condition in Camera Services:
Problem: In both `BackCameraService` and `FrontCameraService`, the camera capture was started BEFORE the frame channel was created. This caused:
- `NullReferenceException` if frames arrived immediately after starting capture
- Potential frame loss during the race window
- The `BoundedChannelFullMode.Wait` setting could block the native camera callback thread
Fix:
- Reordered initialization to create the channel BEFORE starting camera capture
- Changed `BoundedChannelFullMode.Wait` to `DropOldest` to prevent blocking the camera callback thread (real-time video should drop old frames, not block)
- Added null checks in `OnFrameAvailable()`, `StopCapture()`, and `Dispose()` methods
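The bounded-channel setup described above might look like this (capacity and types are illustrative; `DropOldest` is the key setting):

```csharp
using System.Threading.Channels;

// DropOldest keeps the producer (the native camera callback) from ever
// blocking: when the buffer is full, stale frames are discarded instead.
var channel = Channel.CreateBounded<byte[]>(new BoundedChannelOptions(capacity: 5)
{
    FullMode = BoundedChannelFullMode.DropOldest,
    SingleReader = true,   // one streaming loop consumes frames
    SingleWriter = true    // one camera callback produces frames
});

// Camera callback: TryWrite never blocks with DropOldest.
channel.Writer.TryWrite(new byte[460_800]);   // e.g. one 640x480 YUV420 frame
```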
Impact: These fixes resolve issues where the server would appear to start successfully but:
- MJPEG streaming would not transmit any video
- Camera services could crash on first frame
- Event handlers could leak memory over time
v1.5.5: Critical Server Startup Fix (EnableServer Flag). This release fixes a critical bug where servers created using
`ServerConfiguration` would never start.
Fixed Missing EnableServer Configuration Property:
Problem: When using the `Server(ServerConfiguration config)` constructor, the internal `_enabled` flag was always set to `false`. This caused `Server.Start()` to return immediately without actually starting the server, as the start logic checks `if (_enabled && !IsRunning)` before proceeding.
Root Cause: The `ServerConfiguration` class was missing the `EnableServer` property. Since `bool` defaults to `false` in C#, any server created via the configuration constructor would have `_enabled = false`, silently preventing startup.
Fix:
- Added an `EnableServer` property to the `ServerConfiguration` class with a default value of `true`
- The Server constructor now correctly reads this value: `_enabled = configuration.EnableServer`
- Added comprehensive XML documentation explaining the property's purpose
Impact: This was a critical bug that caused servers initialized with `ServerConfiguration` to appear to start (no errors thrown) but never actually accept connections. The `Start()` method would return `false` silently. This fix ensures servers start correctly by default.
Migration Note: Existing code using `ServerConfiguration` will now work correctly without any changes, as `EnableServer` defaults to `true`. If you need to disable a server, explicitly set `EnableServer = false`.
v1.5.6: Camera Resource Management Fix. This release fixes a critical bug where cameras would continue running after all clients disconnected, wasting device resources.
Fixed Camera Not Stopping When Clients Disconnect:
Problem: When RTSP streaming started, the code set `_isCapturingBack = true` (or `_isCapturingFront`), but these flags were never reset when clients disconnected. The WatchDog checked `if (!_isCapturingBack)` before stopping cameras, so it would skip stopping them because the flag was still `true`.
Fix:
- Removed the broken condition that prevented cameras from stopping
- Now properly resets `_isCapturingBack` and `_isCapturingFront` to `false` when stopping cameras
- Cameras now correctly stop when all RTSP clients disconnect
Improved MJPEG Client Detection:
Problem: The WatchDog only checked if the MJPEG server was enabled (`_mjpegServerEnabled`), not if it actually had connected clients. This meant cameras would keep running even when the MJPEG server had no clients.
Fix: Now checks `_mjpegServer?.ClientCount` to determine if MJPEG has active clients before deciding to keep cameras running.
Added Catch-All Resource Cleanup:
Fix: Added a second condition to catch the case where cameras are running (started by EventBuss or MJPEG) but all clients have disconnected:
- If no RTSP clients are playing AND no MJPEG clients are connected AND cameras are running, cameras will now stop
- Logs clearly indicate when cameras are stopped or kept running for clients
Impact: This fix prevents unnecessary battery drain and CPU usage when no clients are connected to the stream.
v1.5.7: Configurable Video Resolution. This release adds full support for configuring camera resolution, allowing users to select from predefined presets or specify custom resolutions.
New VideoResolution Enum:
Added a `VideoResolution` enum with common presets: QVGA (320x240), Low (480x360), VGA (640x480), SVGA (800x600), HD (1280x720), Full HD (1920x1080)
Each preset includes recommended bitrate settings for H.264 encoding
Extension methods provide helper functions: `GetWidth()`, `GetHeight()`, `GetRecommendedMinBitrate()`, `GetRecommendedMaxBitrate()`, `GetFrameBufferSize()`, `GetDisplayName()`
ServerConfiguration Resolution Support:
Added `BackCameraResolution` and `FrontCameraResolution` properties for preset selection
Added `BackCameraWidth`, `BackCameraHeight`, `FrontCameraWidth`, `FrontCameraHeight` for custom resolutions
Helper methods `GetBackCameraWidth()`, `GetBackCameraHeight()`, etc. resolve the effective resolution
Server Class Enhancements:
The constructor now accepts `BackCameraResolution` and `FrontCameraResolution` parameters
New methods: `SetBackCameraResolution()`, `SetFrontCameraResolution()`, `GetBackCameraResolution()`, `GetFrontCameraResolution()`
Camera capture and the H.264 encoder now use the configured resolution
VideoProfile Updates:
Added a `Resolution` property for preset-based configuration
Setting `Resolution` automatically updates `Width`, `Height`, `MinBitrate`, and `MaxBitrate`
Helper methods: `GetFrameBufferSize()`, `GetDimensions()`
H.264 Encoder Considerations:
Resolution directly affects encoder buffer size: `(width * height * 3) / 2` for YUV420
Higher resolutions require more processing power and bandwidth
Bitrate recommendations scale with resolution for optimal quality
Impact: Users can now easily configure video resolution to balance quality, bandwidth, and device performance. Default resolution remains VGA (640x480) for backward compatibility.
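For reference, the YUV420 buffer-size formula above works out as follows for a few of the presets:

```csharp
// One YUV420 frame occupies width * height * 3 / 2 bytes.
static int Yuv420BufferSize(int width, int height) => width * height * 3 / 2;

// VGA:     640  * 480  * 3 / 2 =    460,800 bytes (~0.44 MB per frame)
// Full HD: 1920 * 1080 * 3 / 2 =  3,110,400 bytes (~2.97 MB per frame)
// 4K UHD:  3840 * 2160 * 3 / 2 = 12,441,600 bytes (~11.9 MB per frame)
```

This is why moving from VGA to 4K multiplies per-frame memory by ~27x and motivates the OOM fixes in v1.5.8 below.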
v1.5.8: Modular Architecture Refactoring and High Resolution Support. This release refactors the monolithic Server.cs into focused, testable modules and adds support for resolutions up to 4K UHD.
Major RTSP Server Modular Refactoring:
Problem: The original `Server.cs` was 2,621 lines with 68 methods handling too many responsibilities (networking, protocol parsing, authentication, encoding, streaming, transport, etc.), making it difficult to maintain, test, and debug.
Fix: Refactored into 9 focused modules with clear interfaces:
| Module | Responsibility | Lines |
|---|---|---|
| RtspProtocolHandler | RTSP request parsing and response generation | ~326 |
| SdpGenerator | SDP generation for H.264 and MJPEG | ~173 |
| AuthenticationManager | Digest/Basic authentication, nonce management | ~275 |
| TransportManager | UDP/TCP transport, port allocation | ~254 |
| RtpPacketBuilder | RTP packet creation, NAL fragmentation | ~292 |
| RtcpManager | RTCP sender reports, receiver feedback | ~242 |
| H264EncoderManager | H.264 encoder lifecycle, frame queues | ~435 |
| StreamingController | Main streaming orchestration | ~340 |
| ClientManager | Client lifecycle, cleanup, caching | ~153 |
| Server (reduced) | Composition root, wiring | ~700 |
New Directory Structure:
```
RTSP/
├── Server.cs
├── Protocol/
│   ├── IRtspProtocolHandler.cs, RtspProtocolHandler.cs
│   └── ISdpGenerator.cs, SdpGenerator.cs
├── Transport/
│   ├── ITransportManager.cs, TransportManager.cs
│   ├── IRtpPacketBuilder.cs, RtpPacketBuilder.cs
│   └── IRtcpManager.cs, RtcpManager.cs
├── Streaming/
│   ├── IH264EncoderManager.cs, H264EncoderManager.cs
│   ├── IStreamingController.cs, StreamingController.cs
│   └── FramePacer.cs
├── Security/
│   └── IAuthenticationManager.cs, AuthenticationManager.cs
└── ClientManagement/
    └── IClientManager.cs, ClientManager.cs
```

Benefits:
- Single Responsibility: Each module handles one specific concern
- Testability: Interfaces enable dependency injection and unit testing
- Maintainability: Smaller files (~150-350 lines) are easier to navigate
- Error Tracking: Stack traces now point to specific modules
- Extensibility: Modules can be extended or replaced independently
Added 4K UHD and QHD Resolution Support:
New resolution presets: QHD/2K (2560x1440) and 4K UHD (3840x2160)
Encoder now supports fallback resolutions including 4K, QHD+, QHD, and FHD+
EncoderInfo now tracks `Supports4K`, `SupportsQHD`, `SupportsFullHD`, and `SupportsHD` capabilities

Fixed High Resolution OOM Crashes:
Problem: Streaming at resolutions higher than HD 720p (e.g., Full HD, 1920x1440, 4K) caused `OutOfMemoryError` crashes due to excessive frame buffer memory usage.

Root Causes Identified:
- Fixed 25-frame camera buffer caused ~78MB memory usage at FullHD
- No encoder resolution validation before MediaCodec configuration
- No graceful fallback when encoder didn't support requested resolution
Fix:
- Dynamic Channel Capacity: Camera services now calculate buffer capacity based on resolution, targeting a ~8MB max buffer with 2-10 frames depending on resolution (vs. a fixed 25 frames before)
- Encoder Resolution Validation: `IsResolutionSupported()` checks whether the encoder supports the requested resolution before MediaCodec configuration
- Graceful Fallback: `GetNearestSupportedResolution()` finds the nearest supported resolution if the requested one isn't available
- ActualWidth/ActualHeight Properties: The encoder exposes the actual resolution in use after any fallback
- Server Dimension Updates: The server tracks and updates stored dimensions when the encoder falls back to a different resolution
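The dynamic channel capacity amounts to dividing a memory budget by the YUV420 frame size and clamping the result. A minimal sketch, assuming the ~8MB budget and 2-10 frame bounds stated above (the method name and exact constants are illustrative, not the library's actual code):

```csharp
using System;

class BufferCapacityDemo
{
    const int MaxBufferBytes = 8 * 1024 * 1024; // ~8 MB target budget (assumed)

    // Clamp the frame count so low resolutions don't hoard memory and
    // high resolutions never drop below a minimal 2-frame pipeline.
    static int ChannelCapacity(int width, int height)
    {
        int frameBytes = (width * height * 3) / 2; // YUV420 frame size
        return Math.Clamp(MaxBufferBytes / frameBytes, 2, 10);
    }

    static void Main()
    {
        Console.WriteLine(ChannelCapacity(640, 480));    // VGA: 10 frames (capped)
        Console.WriteLine(ChannelCapacity(1920, 1080));  // Full HD: 2 frames
        Console.WriteLine(ChannelCapacity(3840, 2160));  // 4K: 2 frames (floor)
    }
}
```

Compare the old behavior: a fixed 25 frames at Full HD is 25 × 3,110,400 bytes ≈ 78 MB, which matches the OOM figure above.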
EncoderInfo Enhancements:
- Added `MaxSupportedWidth` and `MaxSupportedHeight` properties
- Added `Supports4K`, `SupportsQHD`, `SupportsFullHD`, `SupportsHD` boolean flags
- These are populated during encoder evaluation for capability reporting
Camera Library Improvements (Kotlin AAR):
Dynamic ImageReader buffer sizing based on resolution
Memory pressure detection to prevent OOM in camera capture layer
Optimized buffer pool management for high-resolution frames
Impact: The codebase is now more maintainable and testable. Users can safely request high resolutions (including 4K) without crashes. The library will automatically fall back to the nearest supported resolution if the device's encoder doesn't support the requested resolution.
v1.5.9: Connection Stability and Timeout Improvements. This release addresses premature disconnections by improving timeout handling, activity tracking, and error logging across all transport modes.
Fixed Premature Client Disconnections:
Problem: The 10-second inactivity timeout was too aggressive and would disconnect stable clients during network congestion or when TCP sends were timing out. Combined with a 5-second TCP send timeout, clients could be disconnected after just 2 consecutive send delays (10 seconds total).
Fix:
- Increased Inactivity Timeout: From 10 seconds to 60 seconds
- Improved Activity Tracking: `LastActivityTime` is now updated at the start of each streaming loop iteration, not just on successful sends. This prevents false inactivity timeouts as long as the streaming loop is actively running.
- Reduced TCP Send Timeout: From 5 seconds to 3 seconds, for faster detection of stuck connections
- With the new settings, a client can experience up to 20 consecutive TCP send timeouts (~60 seconds) before being disconnected, providing much better tolerance for temporary network issues.
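The 20-timeout budget follows directly from the two new values; a back-of-the-envelope check, not library code:

```csharp
using System;

class TimeoutBudgetDemo
{
    static void Main()
    {
        var inactivityTimeout = TimeSpan.FromSeconds(60); // new inactivity timeout
        var tcpSendTimeout    = TimeSpan.FromSeconds(3);  // new per-send TCP timeout

        // Worst case: every send times out back-to-back. The client survives
        // until the full inactivity window has elapsed.
        int toleratedTimeouts = (int)(inactivityTimeout / tcpSendTimeout);
        Console.WriteLine(toleratedTimeouts); // 20 consecutive send timeouts
    }
}
```

Under the old values (10s window, 5s sends), the same division gives only 2, matching the premature disconnections described above.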
Enhanced RTCP Timeout Handling (UDP Mode):
Problem: The 60-second RTCP timeout was too aggressive for UDP clients that don't send RTCP packets regularly, causing premature disconnection of valid clients.
Fix:
- Increased RTCP Timeout: From 60 seconds to 120 seconds
- Added detailed logging for RTCP events (BYE packets, Receiver Reports, timeouts)
- Better distinction between server shutdown and client timeout in error handling
Improved Socket Health Detection:
Problem: Socket.Poll() was using a 1ms timeout which could be too aggressive for slower but stable connections.
Fix:
- Increased Socket Poll Timeout: From 1ms to 10ms
- More forgiving detection of socket disconnection while still catching dead connections quickly
Comprehensive Logging Enhancements:
Added detailed logging throughout the connection lifecycle:
- All disconnection events now log the specific reason (socket disconnected, inactivity timeout, consecutive errors)
- Transport mode (TCP/UDP) included in disconnection logs
- TCP/UDP send errors now log client IDs and consecutive error counts
- Socket error codes logged for better debugging
- Dead client detection logs explain why each client is marked as dead
- WatchDog cleanup operations are now logged with counts
Summary of New Timeout Values:
| Setting | Old Value | New Value | Purpose |
|---|---|---|---|
| Inactivity Timeout | 10s | 60s | Time before disconnecting idle clients |
| TCP Send Timeout | 5s | 3s | Timeout for individual TCP send operations |
| RTCP Timeout (UDP) | 60s | 120s | Timeout waiting for RTCP packets from UDP clients |
| Socket Poll Timeout | 1ms | 10ms | Timeout for socket connectivity checks |

Impact: RTSP connections are now significantly more stable, especially over congested networks or with clients that have slower connections. The enhanced logging makes it much easier to diagnose any connection issues that do occur. Device connections remain stable even during temporary network hiccups or when send operations experience delays.
v1.5.10: Major Performance Improvements - Event-Driven Architecture and Shared Encoding. This release addresses critical performance bottlenecks identified in performance analysis, delivering significantly improved frame rates, reduced CPU usage, and lower latency.
Fixed Issue 2.2: Shared JPEG Encoding Infrastructure (HIGHEST IMPACT):
Problem: RTSP-MJPEG clients were encoding each frame synchronously within their streaming loops. This CPU-intensive operation (`YuvImage.CompressToJpeg`) blocked the entire streaming thread, causing severe frame drops and low FPS. Multiple clients would duplicate this expensive work for every frame.

Solution: Created `JpegEncoderService`, a centralized background encoding service:
- Single Encoding Per Frame: Each frame is encoded only once, then shared across all MJPEG clients (both HTTP and RTSP)
- Background Tasks: Encoding happens in dedicated background tasks using `System.Threading.Channels`
- Non-Blocking Delivery: Clients consume pre-encoded JPEG frames without waiting for encoding
- Automatic Frame Dropping: Bounded channels with a `DropOldest` policy prevent memory buildup
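The encode-once, deliver-to-many idea can be sketched with `System.Threading.Channels`. This is a simplified model, assuming one bounded channel per client; `EncodedJpeg`, `SharedJpegFanOut`, and the method names are illustrative, not the actual `JpegEncoderService` internals:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Channels;
using System.Threading.Tasks;

record EncodedJpeg(byte[] Data, long FrameNumber); // stand-in for EncodedJpegFrame

class SharedJpegFanOut
{
    // One bounded channel per client; DropOldest keeps slow clients from
    // backing up memory while fast clients keep receiving fresh frames.
    readonly ConcurrentDictionary<int, Channel<EncodedJpeg>> _clients = new();

    public ChannelReader<EncodedJpeg> Subscribe(int clientId)
    {
        var ch = Channel.CreateBounded<EncodedJpeg>(new BoundedChannelOptions(2)
        {
            FullMode = BoundedChannelFullMode.DropOldest
        });
        _clients[clientId] = ch;
        return ch.Reader;
    }

    // Called once per camera frame: encode once, publish the result to everyone.
    public void Publish(EncodedJpeg frame)
    {
        foreach (var ch in _clients.Values)
            ch.Writer.TryWrite(frame); // never blocks; overflow drops the oldest frame
    }

    static async Task Main()
    {
        var hub = new SharedJpegFanOut();
        var reader = hub.Subscribe(clientId: 1);
        hub.Publish(new EncodedJpeg(new byte[] { 0xFF, 0xD8 }, FrameNumber: 1));
        var frame = await reader.ReadAsync();
        Console.WriteLine(frame.FrameNumber);
    }
}
```

The key property is that `Publish` runs once per frame regardless of client count, so encoding cost no longer multiplies with connections.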
Impact:
- Frame Rate: RTSP-MJPEG improved from ~5-10 FPS to ~25-30 FPS
- CPU Usage: Dramatically reduced, especially with multiple clients
- Scalability: CPU usage no longer scales linearly with client count
- Client Isolation: Slow clients don't block fast clients
Fixed Issue 2.1: Event-Driven H.264 Frame Delivery:
Problem: H.264 streaming relied on polling with `Task.Delay(10ms)` to check for available frames. This wasted CPU cycles and added an artificial 10ms minimum latency to every frame.

Solution: Implemented an async/await pattern with `Channel.Reader.ReadAsync()`:
- Added a `DequeueFrameAsync()` method to `IH264EncoderManager`
- Replaced `ConcurrentQueue` with `System.Threading.Channels.Channel` in `H264EncoderManager`
- Streaming threads now sleep efficiently until frames are available (zero CPU when idle)
- Frames are delivered immediately when available (no polling delay)
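The event-driven delivery pattern amounts to awaiting the channel reader instead of sleeping in a polling loop. A self-contained sketch of the idea (the actual `DequeueFrameAsync()` signature in `IH264EncoderManager` may differ):

```csharp
using System;
using System.Threading.Channels;
using System.Threading.Tasks;

class EventDrivenQueueDemo
{
    static async Task Main()
    {
        var frames = Channel.CreateBounded<byte[]>(new BoundedChannelOptions(2)
        {
            FullMode = BoundedChannelFullMode.DropOldest // encoder side never blocks
        });

        // Producer: stands in for the encoder output thread.
        _ = Task.Run(async () =>
        {
            for (int i = 0; i < 3; i++)
            {
                frames.Writer.TryWrite(new byte[] { (byte)i });
                await Task.Delay(40); // ~25 fps
            }
            frames.Writer.Complete();
        });

        // Consumer: the streaming loop sleeps until a frame arrives -
        // no Task.Delay(10) polling, no added latency.
        await foreach (var frame in frames.Reader.ReadAllAsync())
            Console.WriteLine($"Sent frame of {frame.Length} byte(s)");
    }
}
```

`ReadAllAsync()` parks the consumer on the channel until the producer writes, which is why idle CPU drops to zero.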
Impact:
- Latency: Eliminated 10ms polling delay
- CPU Usage: Threads sleep instead of spin-waiting
- Battery Life: Reduced CPU usage improves battery on mobile devices
- Responsiveness: Frames delivered immediately when encoder produces them
Fixed Issue 2.4: Modernized H.264 Encoder Frame Queue:
Problem: `H264Encoder` used `ConcurrentQueue` with manual frame-dropping logic (a while loop checking `Count`, manual dequeue).

Solution: Replaced with `System.Threading.Channels.Channel`:
- Bounded channel with a capacity of 2 frames
- `DropOldest` policy automatically handles overflow
- Simplified code by removing manual frame management
Impact:
- Code Quality: Cleaner, more maintainable implementation
- Efficiency: Better frame buffering with less overhead
- Reliability: Automatic backpressure management
Architecture Improvements:
New Components:
- `RTSP/Streaming/JpegEncoderService.cs`: Centralized JPEG encoding service
- `EncodedJpegFrame` class: Represents pre-encoded JPEG frames

Updated Components:
- `StreamingController`: Now consumes from the shared JPEG service
- `H264EncoderManager`: Event-driven async frame delivery
- `H264Encoder`: Channel-based frame queuing
- `Server`: Integrates `JpegEncoderService` and feeds frames to it
Design Patterns:
- Producer-Consumer: Channels cleanly separate frame production from consumption
- Single Responsibility: JpegEncoderService handles only encoding concerns
- Async/Await: Modern .NET async patterns eliminate blocking and polling
- Bounded Buffers: Automatic backpressure management prevents memory issues
Performance Metrics:
| Metric | Before | After | Improvement |
|---|---|---|---|
| RTSP-MJPEG FPS | 5-10 | 25-30 | 3-6x faster |
| H.264 Frame Latency | 10ms+ (polling) | Near-zero | 10ms+ saved per frame |
| CPU Usage (MJPEG) | High, scales per client | Low, shared encoding | Linear to constant scaling |
| Multiple Clients | Each encodes separately | Shared single encode | N-1 encodes eliminated |

Not Implemented (Deferred):
Issue 2.3 (ArrayPool for Frame Buffers): Deferred to future release
- Requires invasive changes to camera service native interop
- Would affect `FrameEventArgs` and all frame consumers
- Risk/benefit analysis favors deferring, given the substantial gains from the other optimizations
- Can be revisited if GC pressure becomes an issue at 1080p+ resolutions
Testing Recommendations:
RTSP-MJPEG: Connect 3+ clients simultaneously and verify smooth 25+ FPS on all
H.264: Verify low latency and no frame drops under load
Memory: Monitor GC collections during extended streaming sessions
CPU: Compare CPU usage vs v1.5.9
Mixed Load: Test concurrent H.264 + MJPEG clients
Impact: This release delivers the most significant performance improvements in the project's history. RTSP-MJPEG is now viable for production use with multiple concurrent clients. H.264 streaming has reduced latency and CPU overhead. The codebase uses modern .NET async patterns throughout for better efficiency and maintainability. See `PERFORMANCE_IMPROVEMENTS.md` for a detailed technical analysis.
v1.5.11: Continuous Streaming Mode - Eliminated Stream Interruptions. This release disables all automatic stop mechanisms to ensure fluid, uninterrupted streaming without cuts or frame drops.
Problem: The WatchDog and auto-stop mechanisms were causing stream interruptions:
- Cameras stopped when last client disconnected
- Encoders stopped and restarted frequently
- Camera restart commands triggered by MJPEG watchdog
- Stream cuts and frame drops during client transitions
- Reconnection delays due to encoder restarts
Solution: Disabled all automatic stop mechanisms for continuous operation:
Disabled WatchDog Auto-Stop:
- WatchDog now only cleans up dead/disconnected clients (preserves important cleanup)
- Cameras keep running regardless of client count
- Encoders keep running regardless of client count
- Streaming state persists once started
- No more automatic encoder/camera stops
Disabled EventBus Camera Stop Commands:
- `STOP_CAMERA_FRONT` and `STOP_CAMERA_BACK` commands are logged but ignored
- Prevents external code from interrupting streams
- Cameras stay active for instant client connections
Disabled MJPEG Watchdog Restarts:
- Watchdog logs frame delays but doesn't restart cameras
- No more camera restarts after temporary delays
- Eliminates stream cuts from camera restarts
Benefits:
- ✅ Zero Interruptions: Streams never cut when clients disconnect/reconnect
- ✅ Instant Reconnection: No encoder restart delay (was ~1-2 seconds)
- ✅ Fluid Experience: No frame drops during client transitions
- ✅ Production Ready: Reliable, predictable behavior
- ✅ Better Multi-Client: New clients can connect instantly without affecting others
Trade-offs:
- ⚠️ Continuous Resource Usage: Cameras and encoders run even with no clients
- ⚠️ Battery Drain: Continuous operation uses more power on mobile devices
- ⚠️ Manual Control: Must explicitly call `Stop()` or `Dispose()` to stop streaming
How to Stop:
- Call `server.Stop()` to stop everything
- Call `server.Dispose()` to release all resources
- Application exit automatically stops everything
Monitoring:
- WatchDog now logs status: `Active clients: RTSP=0, MJPEG=2, Cameras: Back=True, Front=False, Streaming=True`
- Helps monitor server state without auto-stop interference
Configuration:
- Currently hardcoded for maximum reliability
- See `CONTINUOUS_STREAMING_MODE.md` for instructions to restore auto-stop if needed
- Future: A configuration flag for hybrid mode (auto-stop on battery, continuous on power)
Files Changed:
- `RTSP/Server.cs`: Disabled WatchDog auto-stop and EventBus camera stop commands
- `Services/MjpegServer.cs`: Disabled watchdog camera restarts
- `CONTINUOUS_STREAMING_MODE.md`: Complete documentation
Performance Impact:
- CPU: Minimal - encoders efficient, H.264 only encodes when frames available
- Memory: Minimal - bounded buffers with DropOldest prevent buildup
- Battery: Moderate increase on mobile (camera always on)
- Network: Zero impact when no clients (no data sent)
Impact: Streams are now completely fluid with zero interruptions. Perfect for scenarios where reliability is critical (security cameras, monitoring systems, live broadcasts). The server maintains ready state for instant client connections. Trade-off of continuous resource usage is acceptable for server/desktop deployments and provides significantly better user experience.
v1.5.12: Native Library Frame Delivery Fix - Resolved Stream Stopping Issue. This release fixes a critical bug in the native Android camera library that caused streams to stop completely after running for a short time.
Problem: MJPEG streams would stop receiving frames after the native library's internal queue filled up:
- Logs showed: `Back camera queue backing up (81/100), dropping frame`
- After the queue reached 80% capacity, ALL frames were dropped
- The queue was never consumed by any code (a dead/unused feature)
- Once full, the queue stayed full forever, permanently blocking frame delivery
- Result: The stream worked initially, then stopped completely with no recovery
Root Cause Analysis:
```kotlin
// BEFORE (Broken) - in CameraFrameServicev2.kt
if (queueSize > queueCapacity * 0.8) { // 80 frames
    Log.w(TAG, "Back camera queue backing up...")
    return // ❌ DROPS FRAME ENTIRELY - never sent to callback!
}
backCameraCallback?.onFrameAvailable(frame) // Only reached if queue < 80%
backCameraFrameQueue.offer(frame) // Queue never consumed!
```

The frame-dropping check was placed BEFORE sending to callbacks. When the unused queue filled up, it blocked the primary frame delivery path.
Solution: Restructured frame delivery to prioritize callbacks:
```kotlin
// AFTER (Fixed)
// Send to callback FIRST - this is the primary consumer (MJPEG streaming)
backCameraCallback?.onFrameAvailable(frame)

// Queue is optional secondary storage - only add if there's room
if (!backCameraFrameQueue.offer(frame)) {
    // Queue full - drop the oldest, but the callback already received the frame
    val droppedFrame = backCameraFrameQueue.poll()
    droppedFrame?.let { backBufferPool.release(it.data) }
    backCameraFrameQueue.offer(frame)
}
```

Files Changed:
- `AndroidLib/camerastreamer/src/main/java/CameraFrameServicev2.kt`: Fixed frame delivery for both front and back cameras
- Native library version bumped to v2.0.1
How to Update:
- Rebuild the AndroidLib: `./gradlew :camerastreamer:assembleRelease`
- Copy `camerastreamer-release.aar` to `BaluMediaServer/Jar/`
- Rebuild your application
Impact: This was the actual root cause of stream stopping issues. The fix ensures frames are always delivered to .NET regardless of internal queue state. Streams now run continuously without any frame drops or interruptions. The internal queue remains available for future use cases but no longer blocks primary frame delivery.
v1.5.13: MJPEG Streaming Smoothness Improvements. This release significantly improves MJPEG streaming smoothness with architectural improvements inspired by MauiJpegServer.
Problem: MJPEG streaming could feel choppy or have inconsistent frame delivery:
- Encoder pushed frames to all clients synchronously
- No per-client frame rate limiting
- Clients could starve each other on slow networks
- No real-time FPS tracking for diagnostics
Solution: New per-client streaming architecture:
SemaphoreSlim-Based Frame Signaling:
- Each client has its own streaming task that waits on a semaphore
- When encoder produces a frame, it signals all waiting clients simultaneously
- More efficient than polling-based approaches
- Clients wake up exactly when frames are available
Per-Client Frame Rate Limiting:
- Each client respects a configurable max frame rate (default 30 FPS)
- Prevents frame bursting that can cause network congestion
- Smoother, more consistent frame delivery
- New constructor parameter: `maxFrameRate`
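Per-client rate limiting of this kind is typically a minimum-interval check before each send. A sketch of the idea (not the exact `MjpegServer` implementation; the class and method names here are illustrative):

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

class FrameRateLimiter
{
    readonly TimeSpan _minInterval;
    readonly Stopwatch _sinceLastSend = Stopwatch.StartNew();

    public FrameRateLimiter(double maxFrameRate) =>
        _minInterval = TimeSpan.FromSeconds(1.0 / maxFrameRate); // 30 fps -> ~33 ms

    // Await this before sending each frame to a client: bursts are smoothed
    // into evenly paced delivery without affecting other clients.
    public async Task WaitForNextSlotAsync()
    {
        var remaining = _minInterval - _sinceLastSend.Elapsed;
        if (remaining > TimeSpan.Zero)
            await Task.Delay(remaining);
        _sinceLastSend.Restart();
    }

    static async Task Main()
    {
        var limiter = new FrameRateLimiter(maxFrameRate: 30);
        var clock = Stopwatch.StartNew();
        for (int i = 0; i < 5; i++)
            await limiter.WaitForNextSlotAsync(); // paces sends to >= ~33 ms apart
        Console.WriteLine($"5 slots took ~{clock.ElapsedMilliseconds} ms");
    }
}
```

Because each client owns its limiter and runs in its own task, a slow client only delays itself.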
Real-Time FPS Tracking:
- Accurate FPS calculation using `Stopwatch`
- Watchdog logs FPS: `FPS: Back=29.8, Front=30.1`
- New properties: `BackCameraFps`, `FrontCameraFps`
- Total frame counters: `TotalBackFrames`, `TotalFrontFrames`
Per-Client Streaming Tasks:
- Each client runs its own async streaming loop
- Clients are independent - slow client doesn't affect others
- Individual timeout and cleanup per client
- Better client lifecycle management with a `ClientInfo` class
Latest Frame Access:
- New methods: `GetLatestBackFrame()`, `GetLatestFrontFrame()`
- Useful for snapshot endpoints
- Instant frame access without waiting
API Changes:
```csharp
// New constructor parameter
var server = new MjpegServer(
    port: 8089,
    quality: 75,
    maxFrameRate: 30 // NEW: Limit FPS per client
);

// New properties
double backFps = server.BackCameraFps;
double frontFps = server.FrontCameraFps;
long totalFrames = server.TotalBackFrames;

// New methods for snapshots
byte[]? latestFrame = server.GetLatestBackFrame();
```

Files Changed:
- `Services/MjpegServer.cs`: Complete rewrite of the client streaming architecture
Performance Impact:
- Smoother frame delivery with consistent intervals
- Reduced jitter on variable network conditions
- Better multi-client performance (clients don't block each other)
- Lower latency for responsive clients
Impact: MJPEG streaming is now significantly smoother with consistent frame pacing. The new architecture ensures each client receives frames at a controlled rate, preventing the choppy playback that could occur with the previous push-based approach. Inspired by the clean architecture of MauiJpegServer while retaining BaluMediaServer's advanced features (authentication, HTTPS, etc.).
v1.5.14: Client Reconnection Bug Fix. This release fixes a critical race condition that caused streams to crash when clients disconnected and reconnected.
Problem: After a client disconnected and reconnected (or a new client connected), the stream would completely stop:
- Cameras appeared to crash (flashlight could be enabled, indicating camera release)
- New clients would block forever waiting for frames
- The issue occurred due to a semaphore race condition in the frame signaling mechanism
Root Cause Analysis:
- The semaphore release logic only released N times, where N = current client count
- When a client disconnected, the count dropped to 0, so the encoder released 0 times
- New clients connecting between frames would call `WaitAsync()` but never receive a signal
- This created a deadlock where new clients could never receive frames
- Additionally, the `_streamStarted` flag was never reset, preventing on-demand camera restart
Solution: Two-part fix for robust client handling:
Semaphore Always Releases At Least Once:
```csharp
// Before (buggy):
var clientCount = _clientsBack.Count; // Could be 0!

// After (fixed):
var clientCount = System.Math.Max(1, _clientsBack.Count); // Always >= 1
```

- Ensures new clients connecting between frames can acquire the semaphore
- Prevents deadlock when client count temporarily drops to zero
- `SemaphoreFullException` handling still prevents overflow
Reset Stream State on Last Client Disconnect:
```csharp
// In the HandleClient finally block:
if (_clientsBack.Count == 0 && _clientsFront.Count == 0)
{
    lock (_streamLock)
    {
        if (_clientsBack.Count == 0 && _clientsFront.Count == 0)
        {
            _streamStarted = false; // Allow on-demand restart
        }
    }
}
```

- Double-checked locking pattern for thread safety
- Allows cameras to restart on-demand when new clients connect
- Logs state change for debugging
Files Changed:
- `Services/MjpegServer.cs`:
  - Lines 371, 419: Changed to `Math.Max(1, count)` for semaphore release
  - Lines 657-668: Added `_streamStarted` reset in client cleanup
Testing:
- Connect MJPEG client, verify streaming works
- Disconnect client, wait a few seconds
- Reconnect (same or different device) - stream should resume immediately
- Verify no "flashlight available" state (cameras stay ready or restart on-demand)
Impact: Client reconnection now works reliably. The race condition that caused streams to appear "crashed" after disconnect/reconnect cycles is eliminated. This was a critical fix for production deployments where clients may frequently connect and disconnect.
v1.5.15: H.264 Thread Safety Fix and Connection Stability. This release fixes a critical threading bug that caused H.264 streams to freeze after ~2 frames, along with several connection reliability improvements.
H.264 Streaming Freeze Fix (Thread Safety):
Problem: `FeedFrame()` was calling `FeedInputBuffer()` directly on the camera callback thread while `DrainOutputBuffer()` ran on the encoder thread. These concurrent JNI calls to MediaCodec caused the encoder to stall after ~2 frames.

Fix: `FeedFrame()` now routes frames through `_frameChannel` so that both `FeedInputBuffer()` and `DrainOutputBuffer()` are serialized on the encoder thread. This eliminates concurrent JNI access to MediaCodec.

Encoder Channel Capacity:

Fix: Increased the `_frameChannel` bounded capacity from 2 to 5 frames, providing better buffering headroom and reducing frame drops during brief processing spikes.

TOCTOU Socket Race Fix:
Problem: The streaming loop called `IsSocketConnected` (using `Socket.Poll`) on the same RTSP socket that `HandleClient` was reading from. This created a time-of-check-to-time-of-use race where the poll could consume data intended for the RTSP reader, causing false disconnection detection.

Fix: Removed `IsSocketConnected` from the streaming loop. Connection health is now determined solely by counting send errors, which is inherently race-free.

SPS/PPS Deduplication:
Problem: SPS/PPS NAL units were being sent redundantly β both as separate parameter sets before keyframes and embedded within the keyframe data itself.
Fix: Added deduplication logic to skip SPS/PPS NAL units when they have already been sent separately before the keyframe, reducing bandwidth waste.
Transport SendLock Timeout:
Problem: The `_sendLock` in `TransportManager` used a CancellationToken-linked timeout. During server lifecycle events (shutdown, restart), the CTS could be cancelled, causing sends to fail silently instead of timing out normally.

Fix: Changed to a fixed 3-second timeout (`TimeSpan.FromSeconds(3)`) that is independent of the server CancellationTokenSource.

Standalone Send CTS:
Problem: The send CancellationTokenSource was coupled to the server CTS, meaning server shutdown would immediately cancel in-flight sends without allowing graceful client cleanup.
Fix: Decoupled the send timeout CTS from the server CTS, allowing in-progress sends to complete or timeout naturally during shutdown.
Files Changed:
- `RTSP/H264Encoder.cs`: Thread safety fix (`FeedFrame()` routes through the channel); channel capacity increased to 5
- `RTSP/Streaming/H264EncoderManager.cs`: SPS/PPS deduplication logic
- `RTSP/Streaming/StreamingController.cs`: Removed the `IsSocketConnected` TOCTOU race; standalone send CTS
- `RTSP/Transport/TransportManager.cs`: Fixed the `_sendLock` timeout to 3 seconds
Impact: H.264 streaming is now stable and no longer freezes after the first few frames. The thread safety fix resolves the root cause of MediaCodec JNI contention. Connection detection is more reliable without the TOCTOU race, and transport timeouts behave correctly during server lifecycle events.
v1.5.16: H.264 Stream Freeze Fix - MediaTek Encoder Quirks, RTP Timestamps, and Client Lifecycle. This release resolves the remaining causes of H.264 stream freezing on MediaTek devices through a combination of encoder configuration fixes, RTP timestamp correction, and client lifecycle hardening.
Encoder Stall from All-IDR Output (PRIMARY ROOT CAUSE):
Problem: `SetFloat(KeyIFrameInterval, 0.25f)` was misinterpreted by the MediaTek MT6768 as `0`, causing every single frame to become an IDR keyframe. After ~1000 frames, the encoder's internal buffers were exhausted and it stalled permanently: no more output, but input still accepted.

Diagnostic: Encoder output logs showed `key=True` on every frame. After frame ~1000, `Frame dequeue timeout (2000ms)` appeared every 2 seconds with no further encoder output.

Fix: Changed to `SetInteger(KeyIFrameInterval, 1)`. Always use `SetInteger` (not `SetFloat`) for the I-frame interval on Android MediaCodec; sub-second float values are unreliable on many SoCs. A value of 1 means one IDR keyframe per second (~25 frames at 25fps).

RTP Timestamps 1000x Too Fast (CRITICAL):

Problem: `EncoderTimestampToRtp` treated MediaCodec's `PresentationTimeUs` as microseconds, but the MT6768 reports values in units approximately 1000x larger than microseconds. This produced RTP timestamp deltas of ~3,000,000 per frame instead of the expected ~3,600 (at 25fps with the 90kHz clock). Players like VLC interpreted frames as being 33 seconds apart and buffered forever, appearing frozen.

Diagnostic: NAL diagnostic logging revealed encoder timestamp deltas of ~33,333,000 between 25fps frames (they should be ~40,000 if the units were microseconds).

Fix: Replaced encoder-timestamp-based RTP derivation with `Stopwatch` wall-clock time. `BaseEncoderTimestamp` is repurposed to store the `Stopwatch.GetTimestamp()` start tick. The RTP offset is calculated as `elapsedSeconds * 90000.0`, which produces correct ~3,600 deltas regardless of encoder timestamp units. This approach is robust across all SoCs.

Client Lifecycle Killing Active Streams:
Problem: Multiple lifecycle mechanisms (WatchDog, HandleClient, RTCP, TransportManager) were prematurely terminating streaming clients due to unreliable `Socket.Connected` checks, single-error disconnection, and disposal race conditions that caused `ObjectDisposedException` in streaming tasks.

Fixes:
- `GetDeadClients()` checks `IsPlaying` first: playing clients are never marked as dead; only non-playing clients are subject to socket checks and grace-period timeouts
- `TransportManager` uses graduated error counting with a threshold of 10 consecutive failures (TCP) or 5 failures / unreachable host (UDP), instead of immediate disconnection on the first error
- `CleanupClient` sets `IsPlaying = false` before calling `Dispose()` to prevent `ObjectDisposedException` in streaming tasks that may still be running on separate threads
- `HandleClient` uses `ReadLineAsync()` null detection instead of `Socket.Connected` to detect disconnection, avoiding false positives from the unreliable `Connected` property
- Frame dequeue uses a 2-second timeout to prevent blocking forever on encoder stalls
- `FramePacer.ShouldDropFrame` fixed: it now correctly drops frames arriving too fast (less than half a frame interval), not frames arriving after a gap; the previous inverted logic made recovery from stalls even slower
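The wall-clock RTP derivation described for this release can be sketched as follows. This is illustrative only: the real code lives in `RtpPacketBuilder` and repurposes `BaseEncoderTimestamp`, while the names below are local to the sketch:

```csharp
using System;
using System.Diagnostics;

class WallClockRtpDemo
{
    const double RtpClockHz = 90_000.0;                    // 90 kHz H.264 RTP clock
    static readonly long Start = Stopwatch.GetTimestamp(); // stored once at stream start

    // RTP timestamp from elapsed wall-clock time - independent of whatever
    // units the SoC happens to report in PresentationTimeUs.
    static uint CurrentRtpTimestamp()
    {
        double elapsedSeconds =
            (Stopwatch.GetTimestamp() - Start) / (double)Stopwatch.Frequency;
        return (uint)(elapsedSeconds * RtpClockHz);
    }

    static void Main()
    {
        // At 25 fps, consecutive frames are 40 ms apart:
        // 0.040 s * 90,000 Hz ≈ 3,600 ticks per frame, as expected by players.
        Console.WriteLine(0.040 * RtpClockHz);
        Console.WriteLine(CurrentRtpTimestamp());
    }
}
```

Because the clock source is the device's monotonic `Stopwatch` rather than encoder metadata, the ~3,600 per-frame delta holds on any SoC.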
Encoding Loop Reorder:
Problem: The encoding loop fed input first, then drained output. When the encoder's internal input queue was full (because output hadn't been drained), `FeedInputBuffer` would fail and the frame was lost.

Fix: Swapped the order in `EncodingLoop` to drain output before feeding input. This frees encoder resources before attempting to queue new input, reducing unnecessary frame loss on resource-constrained SoCs.

Files Changed:
- `RTSP/H264Encoder.cs`: I-frame interval fix (`SetFloat` → `SetInteger`), encoding-loop drain-before-feed reorder
- `RTSP/Transport/RtpPacketBuilder.cs`: Wall-clock `Stopwatch`-based RTP timestamp derivation
- `RTSP/Transport/TransportManager.cs`: Graduated error counting (threshold of 10 for TCP, 5 for UDP), SendLock timeout logging
- `RTSP/Streaming/StreamingController.cs`: 2-second frame dequeue timeout, `ObjectDisposedException` and `ChannelClosedException` handling
- `RTSP/Streaming/FramePacer.cs`: Inverted drop-logic fix (drops fast frames, not slow ones)
- `RTSP/ClientManagement/ClientManager.cs`: Safe cleanup ordering (`IsPlaying = false` before `Dispose()`), `IsPlaying`-first dead-client check
- `RTSP/Server.cs`: `HandleClient` socket lifecycle fix using `ReadLineAsync` null detection
Debugging Methodology:
This fix was identified through a systematic "debug mode" approach:
- Disabled all lifecycle management (WatchDog, RTCP cleanup, HandleClient socket closing) to isolate the actual streaming issue
- Added frame counter logging to track frame flow through the entire pipeline (camera → encoder → channel → streaming controller → RTP → transport)
- Discovered all-IDR output from encoder logs (`key=True` on every frame)
- After the I-frame fix, added detailed NAL diagnostic logging (NAL type, size, encoder timestamp, RTP timestamp, SPS/PPS info)
- Discovered an RTP timestamp delta of ~3,000,000 instead of the expected ~3,600
- Applied the wall-clock timestamp fix; the stream became fluid
Performance Metrics:
| Metric | Before | After |
|---|---|---|
| H.264 Stream Duration | ~30 seconds then freeze | Continuous, unlimited |
| Encoder Output | Stall after ~1000 frames | Continuous encoding |
| RTP Timestamp Delta | ~3,000,000 (833x too large) | ~3,600 (correct) |
| Client Reconnection | Frequent false disconnections | Stable with graduated error tolerance |
| Frame Recovery After Stall | Slow (drops first frames) | Immediate (drops only bursts) |

Impact: H.264 RTSP streaming now runs continuously without freezing on the MediaTek MT6768 and likely other MediaTek SoCs that share these encoder quirks. The combination of correct I-frame interval configuration, robust RTP timestamp derivation, and hardened client lifecycle management eliminates the three root causes of the freeze. Transient network errors no longer kill the stream, and the encoder no longer stalls from all-IDR output.
Thanks for checking out Balu Media Server!
Feel free to report bugs, suggest features, or fork and play around with the code.
Let's make mobile RTSP with C# a thing! 💪
Made with ❤️ and C# • Open Source • MIT Licensed
| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | net9.0-android35.0 is compatible. net10.0-android was computed. |
Dependencies (net9.0-android35.0):
- Microsoft.Maui.Controls (>= 9.0.51)
- Microsoft.Maui.Essentials (>= 9.0.120)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
v1.5.7: Configurable Video Resolution - Added VideoResolution enum with presets (QVGA to Full HD), resolution configuration in ServerConfiguration and Server constructor, H.264 encoder bitrate recommendations per resolution