You’re standing in the middle of your meticulously crafted world. Your models look dynamite. Your environment is rich and dynamic. The ambiance is palpable. You’re ready to share it, get feedback, and start collaborating to take it across the finish line.
As you begin the live stream, the wind in your character’s hair looks so real you can almost feel it. In fact, you can feel it—as the sound of your computer turning into a jet engine brings you back to reality. Immediately, you know no amount of optimizing shading, lighting, mipmaps, and LODs can save you from the inevitable pains of trying to live stream Unreal Engine without lag.
Unfortunately, the issues inherent to streaming CPU- and GPU-heavy applications are broad and plentiful, and Unreal Engine ranks among the most demanding programs to stream. But you have options for streaming, sharing, and collaborating on projects with your team and clients alike.
To help you find the best solution, here are the most popular workarounds to live stream Unreal Engine without lag, ensuring a smooth editing session.
Pixel streaming
As Unreal Engine’s native streaming tool, pixel streaming is easily one of the most popular and accessible choices. While it offers a healthy slate of benefits, it also carries enough drawbacks to keep it from the top spot on this list.
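To give a rough sense of what the hosting side involves, here’s a minimal sketch of launching a packaged build with streaming flags. The build path and signalling-server URL are placeholders, and the flag names follow Epic’s Pixel Streaming documentation for recent UE5 releases, so check them against your engine version before relying on them.

```python
# A rough sketch of launching a packaged Unreal build with Pixel Streaming.
# Assumes the Pixel Streaming plugin is enabled in the project and a signalling
# web server is already running locally. Paths and URL below are placeholders.
import subprocess

PACKAGED_BUILD = r"C:\Builds\MyProject\Windows\MyProject.exe"  # hypothetical path
SIGNALLING_URL = "ws://127.0.0.1:8888"  # default local signalling server address

subprocess.run([
    PACKAGED_BUILD,
    f"-PixelStreamingURL={SIGNALLING_URL}",  # where the app sends WebRTC signalling
    "-RenderOffscreen",                      # render headlessly on the host machine
    "-AudioMixer",                           # route audio so it can be streamed too
])
```

From there, viewers connect through the signalling server’s web page in any browser, which is what makes the approach device agnostic.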
Pros
- High-quality graphics: Since rendering happens on high-performance servers, the streamed content can be as graphically intense as the hardware allows.
- Device agnostic: Access it through any device with a browser and internet connection.
- Low latency: Aside from one other option on this list, pixel streaming offers some of the lowest-latency experiences available, making it well suited to real-time collaboration.
- Advanced controls: This option allows users to access the front end for live interaction, plus adjust graphics settings to match server capabilities.
Cons
- High bandwidth requirement: It demands significant bandwidth on both the client and server sides, especially for high-frame-rate content; when bandwidth falls short, quality drops and latency climbs.
- High server costs: This option needs cloud servers with powerful GPUs capable of rendering content in real time, which can become costly for large-scale projects and high user counts.
- Scalability issues: Input lag and latency can compound quickly and cause inconsistent experiences between users depending on the number of users, server instances needed, and content intensity.
- Quality loss: Users can experience a loss in quality due to video compression, which can be an issue when working on fine details.
- Lack of comms: Without built-in tools for communication, you’ll still need to be on the phone or integrate a video chat app into your system to communicate with clients and team members.
Evercast
Designed for production and development professionals whose work relies on quality and accuracy, Evercast is a comprehensive, all-in-one solution. If you’re looking to stream your workflow in real time, it can save you from juggling multiple tools or cobbling together complex workarounds.
Its myriad advantages place it atop our list of solutions for streaming Unreal Engine without lag. Some of its biggest differentiators are a helpful suite of collaboration tools (like annotations and recording), a 4K, 60fps, color-accurate feed, and built-in video conferencing.
Plus, it’s the solution of choice for accomplished industry pros.
“Basically every component of our development team uses Evercast, whether they’re reviewing their work in builds or having meetings to review critical pieces of work like animations or things you can’t really capture in a screenshot.” — Parker Davis, game development director at Serenity Forge
Pros
- All-in-one solution: Unlike pixel streaming, Evercast isn’t limited to Unreal Engine, allowing you to stream from any source—and even multiple sources simultaneously. Users can easily toggle between Unreal and any other software, media files, live camera feeds, etc., so team members and stakeholders can watch and weigh in—all in one place.
- Stream Unreal’s virtual cameras: Beyond Unreal’s interface, you can stream its virtual cameras, such as scene cameras, character POVs, etc., by sending the virtual signal out via NDI to the Evercast app. You can even stream multiple cameras concurrently and switch between them for the ultimate perspective on your project’s environment.
- Thoughtful tools: Instead of balancing multiple applications to run your session, Evercast has built-in tools to unify your efforts, including built-in video conferencing, session recording, and comments/annotations. This way, your collaboration is seamless and intuitive—even after the live session.
- Trust and security: Approved by all major studios worldwide and a member of the Motion Picture Association’s Trusted Partner Network (TPN) for content security, Evercast offers 24/7 threat monitoring, watermarking, multi-factor authentication, and encrypted streaming to keep proprietary content safely guarded.
- Comprehensive support: Users enjoy 24/7 on-demand tech support for even the most complex workflows and setups.
Cons
- Cost: Designed for high-level professionals, Evercast has a higher price point than consumer-grade tools. However, flexible pricing options are available on a project-to-project basis.
- Overkill for casual projects: While it’s perfect for pro-level projects, the cost-to-feature ratio makes less sense for teams that won’t use most of its capabilities.
One last note: Evercast is an ever-evolving solution, and its integration with Unreal is only getting more advanced. So, keep an eye out for updates as the platform’s features and capabilities continue expanding.
Learn more about Evercast here, or book a personalized demo for your next editing session.
Create together remotely, in real time
OBS + NDI
Instead of using an all-in-one approach like Evercast, you can split the workload across separate devices before combining their outputs into a single video stream. You can do this in several ways, including pairing NDI (Network Device Interface) with OBS (Open Broadcaster Software).
This approach requires two devices: one dedicated to operating Unreal and the other to video conferencing. Sending the feed from the former to the latter allows you to produce a high-quality video stream without compromising either device’s performance.
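If you want to script part of that handoff, the sketch below shows one way to do it. It assumes OBS is running on the conferencing machine with an NDI source plugin receiving the Unreal feed and with OBS’s built-in WebSocket server enabled, and it uses the third-party obsws-python package with placeholder credentials; adjust everything to your own setup.

```python
# A minimal sketch, assuming OBS 28+ with its WebSocket server enabled
# (Tools > WebSocket Server Settings) and the third-party obsws-python
# package installed (pip install obsws-python). Host, port, and password
# below are placeholders.
import obsws_python as obs

client = obs.ReqClient(host="localhost", port=4455, password="your-password")

# Sanity-check the connection by asking OBS for its version.
version = client.get_version()
print(f"Connected to OBS {version.obs_version}")

# Start OBS's virtual camera so the conferencing app (e.g., Zoom) can pick up
# the scene containing your NDI source as an ordinary webcam.
client.start_virtual_cam()
```

This is just one convenience layer; the core of the workflow is still the NDI feed traveling from the Unreal machine to OBS on the conferencing machine.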
Pros
- Enhanced control: You gain finer control over your session’s audio and video outputs than Zoom-like platforms allow on their own, opening up more and higher-quality streaming options.
- Easy integration: Since this approach integrates directly into Zoom, you can easily collaborate with others on a platform they’re likely already familiar with.
Cons
- Time-consuming setup: It’s a bit laborious on the front end, requiring downloading plugins, configuration tools, and platform-specific screencast software, as well as adjusting the settings of each to get it all dialed in.
- Performance limitations: While pixel streaming and Evercast are constrained mainly by server capabilities, this option is limited by what your virtual camera setup can handle, and those limits can be significant on high-end projects.
- Additional points of failure: Using various plugins, tools, and software could introduce more points of failure, potentially compromising the workflow at any stage.
- Latency drawbacks: Because this option routes the feed across your local network, through OBS, and out to an external conferencing tool, any under-optimized link in that chain adds unnecessary latency to already CPU-heavy workflows.
For a more comprehensive breakdown of this method, check out our article on using NDI for streaming.
Zoom + NDI
By combining Zoom Rooms with NDI’s resource-allocation benefits, you can achieve a higher-quality video stream than Zoom alone allows, making for more effective editing, collaboration, and conferencing.
Pros
- High-quality video: You get better video quality (up to 1080p) and lower latency than with Zoom alone.
- More options: You get greater control over streaming settings, giving you more flexibility than Zoom screen sharing alone.
Cons
- Restrictive requirements: This alternative requires meeting Zoom-based prerequisites, such as an up-to-date Zoom Rooms license, a Zoom Rooms controller, compatible cameras, NDI-capable hardware or software, and other tools that support NDI.
- Additional setup and maintenance: Like the OBS + NDI approach, this option requires changing NDI and support tool settings, along with those in the Zoom Web Portal.
- Lack of source flexibility: Users have to choose between sharing the NDI feed and their own camera, which can compromise communication.
- Limited resolution: This option maxes out at 1080p, which can be quite limiting for working professionals.
- Limited audio controls: Not having control over the audio quality—a key component for games—can make some projects completely untenable.
For a comprehensive explanation of this approach, check out our article on how to use Zoom Rooms + NDI for streaming.
Zoom + OBS
Given that OBS is free and open-source, it’s one of the cheapest and most accessible options available. Unfortunately, its drawbacks tend to outweigh its benefits.
Pros
- A modest improvement: It offers better performance and quality than a standard Zoom meeting.
- Cost-effective: It’s easily the most affordable option.
- Compatibility: It works with Windows, Mac, and Linux operating systems.
Cons
- Setup required: It’s not plug and play; you still need to set up, configure, and regularly update both OBS and Zoom.
- Outdated: OBS’s dated user interface takes time to learn, which cuts into its overall utility.
- Not optimized: This option demands considerable CPU power for OBS—which is also prone to lag—making it potentially problematic in a live collaborative setting.
- Lack of features: Providing and collecting feedback requires additional tools and stopping/starting playback for note-taking.
- Not failsafe: Stretched across users, OBS, and Zoom, this workflow has numerous potential points of failure.
For a more comprehensive understanding of this method, check out our article on using OBS with Zoom for streaming.