AMD At ISE 2023
Advancing AV through adaptive computing.
Integrated Systems Europe (ISE) 2023, the world’s largest AV tradeshow, takes place in Barcelona from 31st January to 3rd February. This year the AMD Pro AV, Broadcast and Consumer team will be located at Hall 5, Stand 5D300, alongside strategic customers and partners, demonstrating adaptive computing solutions designed to enable real-time 4K and 8K-ready applications, transport any media over any network, and support intelligent AV solutions.
High-Throughput JPEG 2000 Video Codec
Demonstrated by new AMD technology partner Parretto, High-Throughput JPEG 2000 (HTJ2K) is a lightweight video codec offering lossy and lossless compression to support AV-over-IP. HTJ2K is an open standard used in networking, contribution, remote production, and cloud-based media workflows, and with no associated royalty fees it offers an attractive option for those needing low-cost, lightweight codec implementations.
Dante Audio Networking
Demonstrating Dante audio networking technology, AMD technology partner Audinate will highlight its new Dante Brooklyn 3 module, based on the AMD Zynq-7000 SoC. Providing a complete, ready-to-use Dante audio interface, Brooklyn 3 is compatible with more than 3,500 Dante-enabled products and is a pin-compatible replacement for Brooklyn 2 modules.
Smart Door Lock
Presented by Makarena Labs, the smart lock demo will show machine learning-based facial recognition and lock control running on the AMD Kria SOM and edge AI products. Users look into a camera, even while wearing facial accessories, and a mechanical lock opens automatically if they are registered for authorised access. The design is aimed at both residential and commercial smart access and can be expanded to support other AI models, for example to gather building-use metrics.
NDI AV-over-IP
AMD and NDI will demonstrate equipment from various suppliers interoperating in an NDI AV-over-IP network, and for the first time introduce NDI running on Kria SOM (system on module) technology. Building NDI on AMD FPGAs and SoCs offers high-resolution, low-latency encoding, and the addition of a Kria SOM implementation can help reduce the time-to-market for adding NDI capabilities to your next design.
KVM-over-IP
Raritan will debut its new KVM-over-IP encoder, supported by a software decoder. The demo will highlight the maturity and interoperability of the Zynq UltraScale+ EV MPSoC’s H.264/H.265 video codec, encoding 4K60 video from an HDMI input with USB support for keyboard and mouse control, as well as the low-latency capability of the full KVM stack.
IPMX Interoperability
AMD technology partners Adeas and Nextera Video, working with Matrox, Village Island and Megapixel VR, will showcase IPMX AV-over-IP and interoperability between various vendors in the IPMX network. Adeas and Nextera Video will present their IPMX IP core running on both a Zynq UltraScale+ FPGA platform and the Kria SOM. These will be senders and receivers in an IPMX network also containing a ConvertIP transmitter from Matrox, a JPEG XS video sender from Village Island and a Megapixel VR LED video wall driver. All of this will be controlled by Matrox ConductIP NMOS signal routing software. IPMX is a truly open, vendor-independent and scalable AV-over-IP technology built on a SMPTE ST 2110 foundation, so it can bridge the worlds of broadcast and Pro AV.
Osprey Talon Encoder & Decoder
Utilising the expanded capabilities of the AMD Zynq UltraScale+ MPSoC, with its H.264/H.265 Video Codec Unit (VCU), the Osprey Talon 4K-SC Contribution Encoder will ingest 12G-SDI and HDMI 2.0 video along with all accompanying metadata and up to 8 stereo audio pairs. The input is scaled and compressed for delivery via modern IP transports including RTMP, TS over UDP/RTP, SRT, Zixi or WebRTC (up to 4:2:2 10-bit). The Osprey Talon-SCD Decoder, using most of the same components, receives the IP stream and converts it back into SDI and HDMI for reintroduction into the video production signal chain with all original metadata intact. The Osprey Talon family is ideal for live contribution and ground-to-cloud streaming applications.
4K Windowing & Tracking
Developed on the Zynq UltraScale+ using the Vitis Video Analytics SDK (VVAS), our 4K windowing and tracking application can serve a variety of live events while keeping equipment, staffing, and setup costs down. An integrated single-chip design using ML at the edge takes input from a 4K camera, tracks multiple objects within the wide-angle view, and crops the footage in real time to multiple HD windows. These cropped sections are then output to a 4K display, providing more interesting shots of the key events and subjects without needing additional cameras.