Video: Panel Discussion: Hardware is Dead!?

The broadcast industry is still producing many new hardware-based products, with FPGAs and encoding ASICs ruling the roost for many companies when it comes to fitting video products into small, power-efficient spaces. But the battle continues: software-based products continue to ramp up, server-based products continue to improve, and the need to virtualise or place functions into the cloud drives the desire for software-based solutions.

We all know that hardware isn’t dead; the interest of the topic lies in where we are today, what is possible and why people are choosing this route, and that’s what Broadcast Solutions’ panel discusses in this video. Often called COTS (commercial off-the-shelf) hardware, the idea is that you can buy the same server as any other industry and run your broadcast-related functions on it. When it’s in the cloud, you’re not even selecting the hardware so much as saying how many CPUs and other resources you’d like.

The first comments come from Marcel Koutstaal of Grass Valley, who feels that the industry doesn’t entirely appreciate the value of software as it’s less tangible than hardware, but Pierre Mestrez from Simplylive makes the point that creating products quickly in a modular way is an important part of that company’s success. Zero Density makes the point that they can work quickly because they can build their software on top of other software – Unreal Engine, for example.

Troubleshooting changes for those who run the systems, we hear from Laurent Petit of EVS. It requires a different way of thinking and different processes compared to simply swapping a card. The transition to IP, adds Marcel, creates a training opportunity because the technology and the workflows are changing at the same time.

Kuban Altan compares audio, which can already be processed in real time by the CPUs in consumer laptops, with the future of video processing. Whilst it’s not so easy to process video with CPUs at the moment, this will change over the coming decade as CPUs improve significantly. Moreover, Kuban looks towards a day where IO between devices is reduced and processing instead stays within the same CPU/GPU.

The move to software is a global trend, states Laurent, partly because of the imperative to work quickly and efficiently in our small industry, where we can benefit by building on software developed for similar uses in other industries. The move will take time, however, explains Marcel, and will take longer than bringing the technology itself online.

The video ends with a discussion of how clearly hardware-bound devices such as cameras can still embrace software to create, in the future, lighter, more flexible cameras which will broaden what you can do with each camera and, ultimately, enhance the creative options available to programme makers.

Watch now!
Speakers

Kuban Altan
Vice President Research and Development,
Zero Density
Marcel Koutstaal
Senior Vice President and General Manager of Camera Product Group,
Grass Valley
Pierre Mestrez
VP Pre-Sales & Channel Partners,
Simplylive
Laurent Petit
SVP Product,
EVS

Video: The next enhancement for RIST

Continuing our look at RIST, the developing protocol which allows for reliable streaming over the internet even in the event of packet loss, we look at a key feature on the roadmap.

The core proposition of RIST is to produce an interoperable protocol which brings the internet into the list of ways to contribute and distribute low-latency video. It’s resilient to packet loss thanks to its ability to re-request packets which have been lost, yet is light enough for live video streaming. In another talk at IBC, we learn about the latest developments which have added security and many other features to the list of capabilities.

Here, Adi Rozenberg from VideoFlow explains how this will be further extended by upcoming work to allow the source stream to reduce its bitrate in response to reduced capacity in the network. With RIST’s ARQ – the technology which requests missing packets – we find that the retransmissions can actually aggravate bitrate constrictions, particularly when the constriction is permanent. Adi proposes that the only real way to solve a lack of bandwidth is to reduce the bitrate of the source.

RIST already includes NULL packet removal, whereby NULL packets aren’t transmitted and are re-inserted at the remote end. This is a great start in reducing the bitrate of the stream. However, more is needed: a way to tell the encoder to reduce the bandwidth of the video stream itself. This can be accomplished with RTCP.
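To make the NULL packet idea concrete, here is a minimal sketch (an illustration of the principle, not RIST’s actual wire format): MPEG-TS null packets carry PID 0x1FFF and exist only to pad the stream to a constant bitrate, so the sender can drop them while recording where they were, and the receiver can re-insert padding at those positions.

```python
# Illustrative NULL-packet removal/re-insertion for an MPEG-TS stream.
# This is a sketch of the concept, not the RIST wire format.

TS_PACKET_SIZE = 188
NULL_PID = 0x1FFF  # PID reserved for null (stuffing) packets

def packet_pid(pkt: bytes) -> int:
    """Extract the 13-bit PID from bytes 1-2 of a TS packet."""
    return ((pkt[1] & 0x1F) << 8) | pkt[2]

def remove_nulls(ts: bytes):
    """Split a TS byte stream into (payload packets, null positions)."""
    kept, null_positions = [], []
    for i in range(0, len(ts), TS_PACKET_SIZE):
        pkt = ts[i:i + TS_PACKET_SIZE]
        if packet_pid(pkt) == NULL_PID:
            null_positions.append(i // TS_PACKET_SIZE)
        else:
            kept.append(pkt)
    return kept, null_positions

def reinsert_nulls(kept, null_positions):
    """Rebuild the original packet sequence with fresh null packets."""
    null_pkt = bytes([0x47, 0x1F, 0xFF, 0x10]) + bytes(184)
    nulls = set(null_positions)
    total = len(kept) + len(null_positions)
    it = iter(kept)
    out = [null_pkt if idx in nulls else next(it) for idx in range(total)]
    return b"".join(out)
```

Only the positions of the removed packets need to travel with the stream, which is why the saving is essentially the full bitrate of the padding.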

Adi highlights the difficulty of detecting when extra bandwidth has returned: a reduction in bandwidth is quickly and clearly signalled by retransmissions, but excess bandwidth returns silently. The system therefore gradually increases the encoder bitrate so that it is always probing the current balance of bandwidth and bitrate.
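That probing behaviour resembles the classic additive-increase/multiplicative-decrease pattern. The controller below is a hypothetical sketch of the idea (the constants and class are our own, not from the RIST specification): retransmission activity triggers a sharp cut, while quiet intervals let the encoder creep back up to discover returned capacity.

```python
# Hypothetical bitrate controller sketching the probing behaviour
# described in the talk; not part of the RIST specification.

class EncoderRateController:
    def __init__(self, bitrate, floor, ceiling,
                 cut_factor=0.7, probe_step=50_000):
        self.bitrate = bitrate        # current encoder target, bits/s
        self.floor = floor            # never drop below this
        self.ceiling = ceiling        # nominal channel capacity
        self.cut_factor = cut_factor  # multiplicative decrease on loss
        self.probe_step = probe_step  # additive increase per interval

    def on_interval(self, retransmissions: int) -> int:
        """Update the target bitrate once per measurement interval."""
        if retransmissions > 0:
            # Congestion is signalled immediately by ARQ traffic:
            # back off multiplicatively.
            self.bitrate = max(self.floor,
                               int(self.bitrate * self.cut_factor))
        else:
            # Recovered capacity is silent: probe upward gradually.
            self.bitrate = min(self.ceiling,
                               self.bitrate + self.probe_step)
        return self.bitrate
```

The asymmetry (fast down, slow up) mirrors the asymmetry of the signals themselves: losses are loud, recovered headroom is not.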

This works well when there is a single encoder and a single decoder. With multiple decoders, life is more difficult. The solution offered is to create a ladder of bitrates, all of which are adaptable, so the destination can switch between profiles. This can be extended to MPTS (Multi-Program Transport Streams), whereby, depending on the destination, services in the MPTS are dropped in order to recover bandwidth. A mechanism prioritises services depending on the destination (e.g. German channels are de-prioritised on delivery to France).
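A small sketch of that destination-aware dropping (the service names, priorities and function are hypothetical, purely to illustrate the mechanism): each destination carries its own priority table, and when capacity falls short the least important services are shed first.

```python
# Hypothetical illustration of priority-based service dropping in an
# MPTS when the available capacity falls below the full mux bitrate.

def select_services(services, priorities, capacity):
    """services: {name: bitrate in bits/s}.
    priorities: {name: int}, higher means more important.
    Returns the set of services that fit, keeping the most
    important services and dropping the rest."""
    ordered = sorted(services, key=lambda s: priorities[s], reverse=True)
    kept, used = set(), 0
    for name in ordered:
        if used + services[name] <= capacity:
            kept.add(name)
            used += services[name]
    return kept
```

With a French destination de-prioritising German channels, a capacity squeeze drops the German service first while the French services survive intact.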

The session ends with a Q&A on stream-switching details and use with statistical multiplexing.

Watch now!
Speakers

Adi Rozenberg
CTO,
VideoFlow

Video: ATSC 3.0 – What You Need to Know

ATSC 3.0 is the next sea change in North American broadcasting, shared with South Korea, Mexico and other locations. Depending on your viewpoint, this could be as fundamental as the move to digital in lockstep with the move to HD programming all those years ago. ATSC 3.0 takes terrestrial broadcasting into the IP world, enabling traditional broadcast to be mixed with internet-based video, entertainment and services as part of one seamless experience.

ATSC 3.0 is gaining traction in the US and some other countries as a way to deliver digital video within a single traditional broadcast channel – and version 3.0 actually moves to broadcasting IP packets over the air.

Now ready for deployment, ATSC 3.0 in the US is at a turning point. With a number of successful trials under its belt, it’s time for the real deployments to start. This panel discussion from TV Technology looks at the groups of stations working together to deploy the standard.

The ‘Transition Guide’ document is one of the first topics this video tackles. With minimal technical detail, this document explains how ATSC 3.0 is intended to work in terms of spectrum, regulatory matters and its technical features and makeup. We then have a chance to see the ‘NextGenTV’ logo, released in September for equipment which is confirmed compliant with ATSC 3.0.

ATSC 3.0 is a suite of standards and work is still ongoing. There are 27 standards completed or in progress, ranging from the basic system itself to captions and signalling. A lot of work is going into replicating features of the current broadcast system, such as a full implementation of the Emergency Alert System (EAS) and similar elements.

It’s well known that Phoenix, Arizona is a test bed for ATSC 3.0, and next we hear an update on the group of 12 stations participating in the adoption of the standard, sharing experiences and results with the industry. They are carrying out trial broadcasts at the moment and will be moving into further testing, including with SFNs (Single Frequency Networks), come 2020. We then see an example timeframe showing an estimated 8-12 months needed to launch a market.

The video approaches its end by looking at case studies with WKAR and ARK Multicasting, answering questions such as when next-gen audio will be available, the benefits of SFNs, how ATSC 3.0 would work with 5G, and a look at deploying immersive audio.

Watch now!
Speakers

Pete Sockett
Director of Engineering & Operations,
WRAL-TV, Raleigh
Mark Aitken
Senior VP of Advanced Technology, Sinclair Broadcast Group
President of ONE Media 3.0
Dave Folsom
Consultant,
Pearl TV
Lynn Claudy
Chairman of the ATSC board
Senior VP, Technology at NAB
Tom Butts
Content Director,
TV Technology

Video: Wide Area Facilities Interconnect with SMPTE ST 2110

Adoption of SMPTE’s ST 2110 suite of standards for the transport of professional media is increasing, with broadcasters increasingly choosing it for use within their facilities. Andy Rayner takes the stage at SMPTE 2019 to discuss the work being undertaken to manage using ST 2110 between facilities. In order to do this, he looks at how to manage the data out of the facility, the potential use of JPEG XS, timing and control.

Long-established practices of using path protection and FEC are already catered for, with ST 2022-7 for seamless path protection and ST 2022-5 for FEC. New to ST 2110 is the ability to send the separate essences bundled together in a virtual trunk. This has the benefit of avoiding streams being split up during transport and hence potentially suffering different delays. It also helps with FEC efficiency and allows transport of other types of traffic.
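The seamless protection idea from ST 2022-7 can be sketched in a few lines (a minimal illustration, assuming a simplified packet model of `(rtp_seq, payload)` tuples rather than real RTP parsing): identical packets arrive over two paths, and the first copy of each sequence number wins, so a loss on one path is invisible as long as the other path delivers.

```python
# Minimal sketch of ST 2022-7-style hitless merging at a receiver.
# Packets are modelled as (rtp_seq, payload) tuples; real receivers
# parse RTP headers and handle sequence-number wraparound.
from itertools import zip_longest

def seamless_merge(path_a, path_b):
    """path_a / path_b: lists of (rtp_seq, payload) in arrival order.
    Interleaves the two paths and keeps the first copy of each
    sequence number, returning packets in sequence order."""
    seen, out = set(), []
    # Zip-style interleaving approximates near-simultaneous arrival.
    for pair in zip_longest(path_a, path_b):
        for pkt in pair:
            if pkt is not None and pkt[0] not in seen:
                seen.add(pkt[0])
                out.append(pkt)
    return sorted(out, key=lambda p: p[0])
```

Because the merge is per-packet rather than per-stream, the switchover is truly hitless: no frame is repeated or dropped when one path degrades.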

Timing is key for ST 2110, which is why it natively uses the Precision Time Protocol (PTP), formalised for use in broadcast under ST 2059. Andy highlights the problem of reconciling timing at the far end, but also the ‘missed opportunity’ that the timing will usually be regenerated, meaning the time of media ingest is lost. This may change over the next year.
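To see why shared PTP time matters, consider how ST 2059-aligned media clocks work: every device derives its RTP timestamps from the common PTP time, so two independent devices stamp the same instant identically. The sketch below assumes the 90 kHz media clock used for ST 2110 video; the helper function is ours, for illustration.

```python
# Sketch of deriving an RTP timestamp from shared PTP time, as in
# ST 2059-aligned systems. The 90 kHz rate is the ST 2110 video
# media clock; the function itself is illustrative.

VIDEO_CLOCK_HZ = 90_000

def rtp_timestamp(ptp_seconds: float, clock_hz: int = VIDEO_CLOCK_HZ) -> int:
    """RTP timestamp for a PTP time (seconds since the PTP epoch),
    truncated to the 32-bit RTP timestamp field."""
    return int(ptp_seconds * clock_hz) % (1 << 32)
```

Because the timestamp is a pure function of PTP time, any two locked devices agree without exchanging state; the ‘missed opportunity’ Andy describes is that regenerating these timestamps downstream discards the original ingest time they encoded.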

ST 2110-22 brings compressed media into ST 2110 for the first time. Andy mentions that JPEG XS can be used – and is already being deployed. Control is the next topic, with Andy focussing on the secure sharing of NMOS IS-04 and IS-05 between facilities, covering registration, control and the security needed.

The talk ends with questions on FEC Latency, RIST and potential downsides of GRE trunking.

Watch now!
Speaker

Andy Rayner
Chief Technologist,
Nevion