Innovation and streaming video: Three takeaways from the IBC Accelerator Kickstart Day
AI, the future of news, and edge computing for live events: here are the main takeaways from the event held at IET London on 6th March.
The 2024 edition of the IBC Accelerator Kickstart Day took place in London, the annual get-together for media brands and technology organisations looking to learn more about the framework that supports fast-track collaboration and innovation. An ideal platform for thought leaders and decision-makers to meet and spark conversation, the IBC Accelerator Kickstart Day gave all participants the chance to identify, address, and solve the business and technology challenges shaping today's industry landscape.
Our team attended the event, listening to several panel discussions and pitches that ranged from innovative technology in the broadcast space to the ever-so-pervasive use of Artificial Intelligence (AI) in the video sector.
In this blog post, we round up the three main themes discussed during the one-day event at IET London, focusing on the much-debated topics of AI, the future of broadcast news, and edge computing applied to live event production. Read on.
AI and content creation
Generative AI (GenAI) in media production has made headlines over the past couple of years as an emerging trend among content producers. Undoubtedly, the approach is gaining more and more relevance, as discussed by Anthony Guarino, Paramount’s EVP Global Production and Studio Tech. Guarino emphasised the need for creatives to be disruptive in guiding AI models and to sharpen their prompting to leverage GenAI effectively. How to do that? One of the keys seems to be writing more descriptive prompts – including shot lists, ambiences, and sequencing – before GenAI’s capabilities can be used to the full.
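To make that advice concrete, here is a minimal, hypothetical sketch of what a more descriptive text-to-video brief might look like as structured data. The fields (shot list, ambience, sequencing) follow Guarino’s examples, while the helper function and data structure are our own illustration, not any vendor’s actual API.

```python
# Hypothetical illustration: turning a vague idea into a descriptive,
# production-style brief for a text-to-video model. The structure below
# is our own sketch, not a real product interface.

from dataclasses import dataclass


@dataclass
class ShotBrief:
    """One shot in a sequenced shot list, with ambience cues."""
    description: str          # what happens in the shot
    ambience: str             # lighting, weather, mood, soundscape
    duration_seconds: float   # rough timing, used for sequencing


def build_prompt(logline: str, shots: list[ShotBrief]) -> str:
    """Flatten a logline plus a sequenced shot list into a single text prompt."""
    lines = [f"Logline: {logline}", "Shot list:"]
    for i, shot in enumerate(shots, start=1):
        lines.append(
            f"  Shot {i} ({shot.duration_seconds:.0f}s): {shot.description}. "
            f"Ambience: {shot.ambience}."
        )
    return "\n".join(lines)


if __name__ == "__main__":
    prompt = build_prompt(
        logline="A news crew covers a storm rolling into a coastal town",
        shots=[
            ShotBrief("Wide establishing shot of the harbour",
                      "grey sky, rising wind, distant gulls", 6),
            ShotBrief("Close-up of a reporter speaking to camera",
                      "rain starting, handheld, urgent mood", 8),
        ],
    )
    print(prompt)
```

The point of the exercise: the richer the brief, the less the model has to guess about framing, mood, and order of events.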
Among the examples brought to the audience’s attention was Sora, the (still unreleased) OpenAI model that promises to create realistic and imaginative scenes from simple text instructions. Currently available only to red teamers assessing critical areas for harm and risk, Sora is expected to generate complex scenes with multiple characters, specific types of motion, and accurate details of subject and background. Furthermore, the model is set to understand not only what the user asks for in the prompt, but also how those things exist in the physical world.
Interest in GenAI is not confined to big content creators in the streaming space. Public Service Broadcasters (PSBs) have already understood the importance of this innovation for the years to come. Steve Belford, Enterprise Innovation Architect at Channel 4, reminded the audience – as part of the Champions Roundtable – that Toy Story (Pixar, 1995) used technology that was innovative for its time to set a new standard for animation. The industry is now looking for a ‘Toy Story moment for GenAI’: a comparable creative breakthrough in its application.
💡 Learn more:
BBC sets out plans for incorporating GenAI
Channel 4: Innovating and experimenting with AI
The future of news production and journalism
AI has been at the centre of media attention for quite a while now because of its ‘darker’ aspects, chief among them deepfakes. Defined as synthetic media, deepfakes are digitally manipulated visuals that convincingly replace one person’s likeness with another’s. They are generated by manipulating facial appearance with deep generative methods, leveraging powerful AI techniques to produce visual and audio content that can deceive the public.
Needless to say, the spread of false information (and, often, hate speech) through deepfakes is being taken very seriously by the news industry, especially in the upcoming election year. The existential threat posed by this particularly sinister form of misinformation could have real-world effects on electoral outcomes.
The risk posed by deepfake media was highlighted by Claudia Milne, SVP of Standards and Practices at CBS News, who also called for a collaborative approach within the news broadcasting sector. Tackling misinformation cannot be done in isolation, even though CBS News is reportedly working on a news authenticator to filter out deepfakes.
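CBS has not disclosed how its authenticator works, but the underlying idea of content authentication can be illustrated simply: sign footage at the point of capture, then verify downstream that the bytes have not been altered. The stdlib-only sketch below is our own illustration of that principle (in the spirit of provenance standards such as C2PA), not CBS’s system; a production implementation would use public-key signatures and embedded manifests.

```python
# Minimal illustration of content authentication: tag media bytes at capture,
# verify them downstream. This is our own sketch of the general principle,
# not CBS's actual authenticator.

import hashlib
import hmac

SECRET_KEY = b"newsroom-signing-key"  # placeholder; real systems use key pairs


def sign(media_bytes: bytes) -> str:
    """Produce an authentication tag for a piece of footage at capture time."""
    return hmac.new(SECRET_KEY, media_bytes, hashlib.sha256).hexdigest()


def verify(media_bytes: bytes, tag: str) -> bool:
    """Check downstream that footage still matches its capture-time tag."""
    return hmac.compare_digest(sign(media_bytes), tag)


if __name__ == "__main__":
    original = b"...raw video bytes..."
    tag = sign(original)
    print(verify(original, tag))                  # True: untouched footage
    print(verify(b"...altered bytes...", tag))    # False: content was changed
```

The hard part in practice is not the cryptography but the ecosystem: every newsroom, platform, and device in the chain has to participate, which is why Milne’s call for collaboration matters.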
Edge computing and live events
The BBC and BT took the stage to present an accelerator pitch focusing on the use of edge computing for live events with limited bandwidth. The pitch featured a live demo presented by Ian Wagdin, Senior Technology Transfer Manager at the BBC, and John Ellerton, Head of Future at BT Media & Broadcast.
The central idea behind the project is that live events could, in the near future, be covered by an agile, lightweight team, making broadcast vans for satellite uplink unnecessary. The concept: footage is transmitted over the internet in raw form – raw here meaning ungraded and unmixed – and subsequently processed by dedicated, cloud-based tools that optimise grading, audio, and other aspects of the final output.
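As a rough illustration of that workflow, the sketch below uses Python to launch an ffmpeg contribution encode and push it over SRT, a protocol commonly used for internet-based remote production. The ingest host, port, and bitrate are placeholders, and the snippet assumes an ffmpeg build with SRT support; it is not the actual stack used in the BBC/BT demo.

```python
# Rough sketch of an internet contribution feed for remote production:
# encode a local camera source and push it over SRT to a cloud ingest point,
# where grading, audio mixing, etc. would happen. Endpoint and bitrate are
# placeholders; this assumes ffmpeg built with SRT support and is not the
# actual BBC/BT pipeline.

import subprocess

INGEST_URL = "srt://cloud-ingest.example.com:9000?mode=caller"  # hypothetical endpoint


def start_contribution_feed(source: str, bitrate: str = "8M") -> subprocess.Popen:
    """Launch ffmpeg to encode `source` and stream it to the cloud over SRT."""
    cmd = [
        "ffmpeg",
        "-re",               # read the source at its native frame rate
        "-i", source,        # camera or capture-card input
        "-c:v", "libx264",   # contribution encode; the final grade happens in the cloud
        "-b:v", bitrate,     # constrain bandwidth to the venue's connectivity
        "-c:a", "aac",
        "-f", "mpegts",      # MPEG-TS container over SRT
        INGEST_URL,
    ]
    return subprocess.Popen(cmd)


if __name__ == "__main__":
    feed = start_contribution_feed("camera_feed.mp4")
    feed.wait()
```

The design trade-off is the one the pitch highlighted: a small on-site team only needs a camera, an encoder, and connectivity, while the heavy lifting moves to cloud tooling.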
The topic is of great interest to PSBs such as the BBC, which is pivoting towards extensive use of the cloud for real-time event coverage, with the added benefit of ‘lighter’ on-prem set-ups. Among the examples quoted were some of the most-watched UK events of the past 18 months: Queen Elizabeth II’s funeral and King Charles’ coronation, both produced using a 5G-powered camera network.