Toronto. Unless you have been living ‘off the grid’, you have heard about artificial intelligence (AI) and deep fakes. Even TV programs have covered the issue, in which a ‘deep fake’ is created by using AI to combine images of a person and their possession(s) into something that never happened. The same thing happens with still photographs, which is the subject of the article described below.
A recent article in IEEE Spectrum by Matthew Smith, “First camera with built-in content credentials verifies photos’ authenticity”, discusses how content credentials and camera hardware are beginning to combat this sinister situation for stills. The first camera to offer this protection is the digital Leica M11, shown here.
This reminds me of the old Mad magazine “Spy vs. Spy” strips, where each action sparks a counter-action (much like today’s battles between hackers and defensive countermeasures).
Our thanks to George Dunbar for spotting and sharing this article on this very modern issue.