We connect students to industry, giving them tangible experience in the media field. We give students and recent graduates the opportunity to work with industry-grade equipment for real clients, while ensuring that the content is of the highest standard before and during delivery. We also pursue research into emerging technologies, pushing the capabilities of the media industry and preparing students beyond the curriculum.
We are a Digital Services company that specialises in bringing multi-skilled, energetic new talent directly into industry. We recognise true professionals and hard workers, and we do everything we can to create an environment in which their skills are nurtured and they are given the best opportunities to shine. We are never short of fresh, highly skilled new talent who want to show us what they can do, and we create environments where they can do exactly that across our range of creative projects.
Through our connections with industry, we help young talent engage with media professionals, allowing both to flourish in a driven environment.
We engage with media in different ways, challenging traditional media norms and telling stories through a variety of methods.
Badger & Combes has worked with industry partners and business units, including Salford University, to demonstrate significant cost savings in high-quality media production. In doing so, Badger & Combes has also delivered first-hand industry experience to over 300 students at both undergraduate and postgraduate levels.
· Camera operation (Sony FS7/HCX 100 camera channels, Canon, Panasonic 301/500, a range of broadcast cameras and DSLRs)
· Video Editing (Avid, Premiere Pro, After Effects, DaVinci Resolve, Final Cut)
· 360° video content for VR/AR applications
· Live streaming capabilities for multiple platforms
· Event Coverage
· Promotional Content
· Drone & Aerial Footage
· Documentary Production
· Research & Development
Tailoring content around media consumers
Object-Based Media (OBM) is an area of research in which algorithms construct content dynamically, giving each user a personalised experience. Every component of a piece of content (each clip, audio track, music cue, graphic, etc.) is stored and assembled separately, allowing the content to be tailored to a specific audience.
This would affect every aspect of the final product, from the language to the visual aesthetics, the length, the tone, and even the aspect ratio in which the content is shown.
For this broadcast method to become viable, however, there would need to be a shift in consumer behaviour, as well as changes to production methods to find the most efficient ways of producing object-based content.
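As an illustrative sketch only (all names and the selection rule are our invention, not a real OBM system), the assembly idea above can be thought of as keeping several variants of each component and letting a renderer pick one variant per component based on the viewer's profile:

```python
# Hypothetical sketch of object-based assembly: each component of a piece
# of content exists in several variants, and one variant is selected per
# component according to the viewer profile.

from dataclasses import dataclass


@dataclass
class Component:
    kind: str          # e.g. "clip", "music", "graphics"
    variants: dict     # variant tag -> asset id; must include "default"

    def select(self, profile: dict) -> str:
        # Prefer a variant matching the viewer's language, else the default.
        return self.variants.get(profile.get("language"), self.variants["default"])


def assemble(components: list, profile: dict) -> list:
    """Assemble a personalised edit by choosing one variant per component."""
    return [c.select(profile) for c in components]


intro = Component("clip", {"default": "intro_en.mp4", "ko": "intro_ko.mp4"})
music = Component("music", {"default": "calm.wav", "upbeat": "upbeat.wav"})

print(assemble([intro, music], {"language": "ko"}))
# -> ['intro_ko.mp4', 'calm.wav']
```

In a full system the profile would also drive length, tone, and aspect ratio, but the principle is the same: components stay separate until assembly time.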
Other institutions are also conducting research into OBM technologies and techniques. We have developed a research platform that allows us to experiment with OBM workflows. This is our web-based tool, Media Framework.
Media Framework, in its current state, allows for the semantic tagging of media assets, which can then be called systematically by setting parameters. In essence, it acts as a back end for OBM.
The eventual aim is to make the tool commercially viable, so that companies can use it as a conduit through which to organise and broadcast their content.
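As a rough sketch of the tag-and-call model described above (the class and method names are assumptions on our part, not Media Framework's real API), the back end can be pictured as a store that attaches semantic tags to assets and returns every asset matching a set of required tags:

```python
# Hypothetical sketch of a tag-and-call back end: assets are stored with
# semantic tags, and content is "called" by querying for required tags.

class AssetStore:
    def __init__(self):
        self._assets = {}          # asset id -> set of tags

    def tag(self, asset_id: str, *tags: str) -> None:
        """Attach one or more semantic tags to an asset."""
        self._assets.setdefault(asset_id, set()).update(tags)

    def call(self, *required_tags: str) -> list:
        """Return every asset carrying all of the required tags."""
        need = set(required_tags)
        return sorted(a for a, tags in self._assets.items() if need <= tags)


store = AssetStore()
store.tag("glaze_closeup.mp4", "ceramics", "glazing", "closeup")
store.tag("kiln_wide.mp4", "ceramics", "firing")

print(store.call("ceramics", "glazing"))   # -> ['glaze_closeup.mp4']
```

Setting parameters then amounts to choosing which tags to require, leaving the front end free to decide when and how the returned assets are played.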
You can learn more about OBM through the BBC's research here.
Media Framework was developed during our time at the Studio for International Media & Technology.
In order to showcase this, we built a front end to the tool in the form of a musical performance. The piece, titled Five Elements of Living Treasure, is a musical/video performance based around the work of Korean ceramic artist, Gyung Kyun Shin. The concept and music were composed by Professor Insook Choi.
The piece is performed via a touchscreen interface that allows the performer to move 'nodes' around a 'well'. Each node is linked to specific themes within Media Framework, and these themes are linked semantically to the types of audio and visual content that will be displayed, making each performance interactive and triggering slightly different media every time.
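One simple way to picture the node-and-well mechanic (the geometry, theme names, and weighting rule here are entirely our invention): each theme has an anchor point in the well, and a node's proximity to each anchor weights how strongly that theme's media is triggered.

```python
# Illustrative sketch of a node-and-well interface: themes sit at anchor
# points, and a node's position yields a normalised weight per theme via
# inverse distance, so the nearest theme dominates the media selection.

import math

THEMES = {"clay": (0.0, 0.0), "glaze": (1.0, 0.0), "fire": (0.5, 1.0)}


def theme_weights(node_xy, anchors=THEMES):
    """Inverse-distance weights per theme, normalised to sum to 1."""
    raw = {t: 1.0 / (0.05 + math.dist(node_xy, p)) for t, p in anchors.items()}
    total = sum(raw.values())
    return {t: w / total for t, w in raw.items()}


weights = theme_weights((0.1, 0.1))
print(max(weights, key=weights.get))   # nearest theme dominates: 'clay'
```

These weights could then be fed into the semantic query so that nearby themes contribute more of the triggered audio and video.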
As well as this new method of calling media, the performance used video analysis to extract sound from the data being produced, via a sonification process. The piece was also performed at the International Computer Music Conference in Daegu, South Korea, in August 2018.
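Sonification in this sense means mapping an analysed feature of the video onto a sound parameter. As a minimal sketch (the feature and the mapping are our choice for illustration, not those used in the piece), one could map each frame's mean brightness onto an audible frequency:

```python
# Illustrative sonification sketch: a per-frame feature from video
# analysis (mean brightness in 0..255) is mapped linearly onto an
# audible frequency range, turning image data into pitch.

def brightness_to_hz(brightness: float, lo: float = 110.0, hi: float = 880.0) -> float:
    """Map brightness in [0, 255] to a frequency in [lo, hi] Hz."""
    x = min(max(brightness / 255.0, 0.0), 1.0)   # clamp and normalise
    return lo + x * (hi - lo)


frames = [0.0, 127.5, 255.0]          # mean brightness per analysed frame
print([round(brightness_to_hz(b)) for b in frames])   # -> [110, 495, 880]
```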
After this, the performance was adapted into an interactive gallery installation at Castlefield Gallery in Manchester, UK. This allowed visitors to explore all aspects of the ceramic production process and to experiment with mixing sounds, much in the style of the performance interface.
To read more about the performance piece and installation, visit Professor Choi's website here.