The Film Look in Black and White — Film, CCD and CMOS Compared
"CCD LOOK" or "FILM LOOK" — looking through my website data analytics these two search terms come up a lot, and over the last year or so if you're a YouTube watcher, you'd probably have noticed those topics come up quite a bit also, so what is "CCD LOOK" and "FILM LOOK"? Let's start off with asking Google AI mode, here's the answer:
[ 'CCD look' refers to the unique, nostalgic aesthetic produced by older digital cameras using Charge-Coupled Device (CCD) sensors, often described as having a film-like quality. It is characterised by vibrant, organic colors, smoother tonal transitions, and a softer "milky" highlight roll-off compared to the sharper, more clinical results of modern CMOS sensors. ]
So in a nutshell, getting a digital image that replicates the look of film — excellent, blog over. Only joking. Is it possible? Let's take a look. Down below, in no particular order, are three images: one taken on film (Ilford Delta 400), one taken on my Sony A350, a 2008 crop sensor camera with a CCD sensor, and, just for fun, one more taken on my Sony A700, a 2007 crop sensor camera with an early CMOS sensor. All edited in my usual style. If you're a regular to my site then you'll know I only shoot in black and white JPEG, so I can't offer any colour references — this is a straight-up comparison in black and white. I've not labelled the images, so see if you can tell which is which. We'll have a look at the different mediums first, then I'll reveal the answers at the end.
(Click on the images to enlarge)
Film
Before digital existed, film was all there was. Every camera from a disposable throwaway to a professional Hasselblad was, at its most basic, a light-tight box with a hole in it. The lens focused the light, the shutter controlled how long it was let in, and the film did the rest. The camera itself was largely irrelevant — the film was your sensor, your processing engine and your final image all in one.
Film works through a chemical reaction. Silver halide crystals suspended in a gelatin layer on the film base react when exposed to light, forming what's known as a latent image — invisible until the film is developed. The size and distribution of those crystals determines the character of the film. Faster films like Ilford Delta 400 use larger crystals to gather more light, and it's those crystals that give film its grain — that organic, textured quality that people have been trying to replicate digitally ever since.
Delta 400 is a modern tabular grain film (T-grain), meaning the crystals are flat and uniform rather than random and chunky. The result is finer grain than you'd expect from a 400 ISO film, good shadow detail and — crucially — a highlight roll-off that digital has always struggled to match. Film doesn't clip highlights suddenly the way a digital sensor does. It rolls off gradually, retaining detail in bright areas in a way that feels natural to the eye. You can push Delta 400 hard and it holds together. That's not an accident, that's chemistry.
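For the technically curious, here's a toy sketch of that roll-off difference in Python. It's not a measurement of any real film or sensor; the curve shapes are picked purely for illustration, and the shoulder formula is a common toy model rather than Delta 400's actual characteristic curve.

```python
import numpy as np

# Scene brightness from black up to 4 stops past "full" exposure
exposure = np.linspace(0.0, 4.0, 9)

# A digital sensor responds linearly, then clips hard once the
# photosites are full: everything above 1.0 becomes pure white.
digital = np.clip(exposure, 0.0, 1.0)

# A film-style shoulder compresses gradually instead. The
# 1 - exp(-x) form is just a convenient toy stand-in.
film = 1.0 - np.exp(-exposure)

for e, d, f in zip(exposure, digital, film):
    print(f"exposure {e:4.1f}   digital {d:.2f}   film {f:.2f}")
```

Past an exposure of 1.0 the digital column flatlines at 1.00, pure white with no detail, while the film column keeps creeping upwards. That slow creep is the highlight separation you see in a bright sky on film.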
There's also something in the way film responds to light that goes beyond technical explanation. The grain isn't just noise — it's organic, it's part of the image, it breathes with it. Two frames shot on the same roll in the same light will never be identical at a grain level, and that subtle unpredictability gives film a quality that feels alive in a way that even the best digital can struggle to replicate. Whether that makes it better is a matter of taste. But it does make it different — and that difference is exactly what this blog is about.
CCD — Charge-Coupled Device
The CCD sensor was the technology that made digital photography viable. Developed in 1969 by Willard Boyle and George Smith at Bell Labs — originally as a memory storage device rather than an imaging tool — it became the backbone of digital cameras from the early 1990s right through to the late 2000s. If you picked up a digital camera before roughly 2010, chances are it had a CCD sensor inside it.
The way a CCD works is relatively straightforward. Light hits the sensor, which is covered in millions of tiny light-sensitive photosites. Each photosite collects the light that falls on it and converts it into an electrical charge. Those charges are then moved across the sensor in sequence — like a bucket brigade — and read off at the edge before being converted into the image data that becomes your photograph. It's an elegant system, but it's also power-hungry, slow to read out and expensive to manufacture at large sizes. Those limitations are ultimately what ended it.
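If code speaks to you more than analogies, here's the bucket brigade as a toy Python sketch. It bears no relation to real camera firmware; it just shows the shifting.

```python
import numpy as np

def ccd_readout(sensor):
    """Toy CCD readout: every cycle, the single amplifier at the
    edge reads one column, then all remaining charge packets shift
    one place towards it, like buckets passed down a line."""
    charges = sensor.astype(float).copy()
    rows, cols = charges.shape
    image = np.zeros((rows, cols))
    for step in range(cols):
        image[:, step] = charges[:, 0]        # read the edge column
        charges[:, :-1] = charges[:, 1:]      # shift everything one step over
        charges[:, -1] = 0.0                  # the far edge empties out
    return image

sensor = np.random.randint(0, 256, size=(4, 6))     # pretend photosite charges
assert np.array_equal(ccd_readout(sensor), sensor)  # the charge arrives intact
```

The point of the toy is the bottleneck: every charge packet has to pass through its neighbours to reach a single readout point at the edge, which is exactly why the real thing was slow and power-hungry.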
By the early 2010s CMOS technology had caught up and then overtaken CCD in almost every measurable way — faster readout speeds, lower power consumption, cheaper to produce and better high ISO performance. Camera manufacturers moved on quickly and CCD sensors largely disappeared from mainstream cameras. Sony, Canon, Nikon — they all made the switch. The CCD era was effectively over within a decade of digital photography going mainstream.
So why do people talk about the CCD look with such affection? A lot of it comes down to colour. CCD sensors produced colours that were rich, saturated and organic — particularly in skin tones and natural greens (this is entirely subjective, I should add, not something technically proven). The highlight roll-off was gentler than early CMOS, the tonal transitions smoother, and the overall rendering had a warmth that felt closer to film than the sharper, cooler, more clinical output of modern CMOS sensors. It wasn't necessarily more accurate — but it felt more natural, and that distinction matters to photographers.
If you're a black and white shooter like myself, the colour argument is irrelevant — but the tonal qualities are not. The smoother highlight roll-off that CCD sensors produce translates directly into black and white as a more gradual transition from bright areas to white, retaining detail in skies and high contrast scenes in a way that feels less digital. The midtone rendering has a depth and subtlety that suits landscape and documentary work particularly well. Grain and noise at higher ISOs on a CCD also behave differently to CMOS — they tend to be finer, more evenly distributed and visually closer to film grain than the chunkier, more random noise pattern of early CMOS sensors. For someone shooting black and white JPEG straight out of camera with no RAW safety net, that behaviour counts.
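Words like "finer" and "chunkier" are doing a lot of work there, so here's a rough illustration of what I mean. The parameters are picked by eye for the sake of the demonstration, not measured from either sensor.

```python
import numpy as np

rng = np.random.default_rng(0)
grey = np.full((256, 256), 128.0)  # a flat midgrey patch

# "CCD-like" noise: fine, per-pixel, evenly distributed,
# reading visually closer to film grain
fine = rng.normal(0.0, 8.0, size=(256, 256))

# "Early-CMOS-like" noise: generated at a coarser scale, then
# blown up so it clumps into larger, blockier structures
blocks = rng.normal(0.0, 8.0, size=(64, 64))
coarse = np.repeat(np.repeat(blocks, 4, axis=0), 4, axis=1)

ccd_like = np.clip(grey + fine, 0, 255)
cmos_like = np.clip(grey + coarse, 0, 255)
```

Viewed side by side, the first patch reads as texture and the second as blotch. Exaggerated, of course, but that's the distinction I'm gesturing at.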
CMOS — Complementary Metal-Oxide Semiconductor
CMOS technology actually predates CCD in terms of its origins — the underlying semiconductor technology had been around since the 1960s. But for a long time it was considered unsuitable for imaging. Early CMOS sensors were noisy, inconsistent and produced images that couldn't match CCD quality. It took until the late 1990s and early 2000s for engineers to refine the technology to the point where it became a serious alternative, and by the mid 2000s it was beginning to appear in mainstream consumer cameras.
The fundamental difference in how CMOS works compared to CCD is in how the image data is read. Rather than moving charges across the sensor in sequence, each photosite on a CMOS sensor has its own individual readout circuit built directly into it. This means the data can be read much faster, the power consumption is dramatically lower and the manufacturing process is cheaper — CMOS chips can be produced on the same production lines used for computer processors. Those advantages made it inevitable that CMOS would eventually win.
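Sticking with the toy sketches from earlier, the CMOS version is almost embarrassingly short, and it also hints at why those early sensors were noisy. Both the direct addressing and the gain figure below are illustrations, not real hardware values.

```python
import numpy as np

rng = np.random.default_rng(42)
charges = rng.integers(0, 256, size=(4, 6)).astype(float)

# Toy CMOS readout: every photosite has its own readout circuit,
# so the whole array is available at once. No bucket brigade,
# no single bottleneck at the edge of the chip.
#
# The price: millions of tiny amplifiers that are never perfectly
# identical. Modelling that as a slightly different gain per
# photosite gives you fixed-pattern noise, one of the things that
# made early CMOS images look rough next to CCD.
per_pixel_gain = rng.normal(loc=1.0, scale=0.02, size=charges.shape)
image = charges * per_pixel_gain  # one parallel step, not one column at a time
```

Calibrating that per-amplifier variation away was, as I understand it, a large part of what those late-1990s refinements amounted to.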
The Sony A700, released in 2007, sits right at the beginning of that transition. It uses an early CMOS sensor at 12.24 megapixels — capable and competent, but still carrying some of the characteristics of that early generation of the technology. The noise at higher ISOs has a different quality to CCD — slightly chunkier, less organic, more obviously digital in character. The tonal transitions, particularly in highlights, are a little more abrupt. Not dramatically so, but enough that shooters who had grown up with CCD or film noticed the difference. The A700 is an excellent camera — but it sits in an interesting transitional period where CMOS was good enough to replace CCD commercially without yet having fully matched it aesthetically.
Fast forward to today and modern CMOS sensors are a completely different proposition. A current full frame sensor from Sony, Canon or Nikon will resolve 45, 50, even 60 megapixels with dynamic range that leaves both early CMOS and CCD in the shade. Highlight recovery in RAW files from a modern sensor is extraordinary — you can pull back several stops of overexposed sky and recover usable detail. High ISO performance that would have been unthinkable ten years ago is now routine. In pure technical terms, a modern CMOS sensor is objectively better than anything that came before it in every measurable way.
But technical superiority and aesthetic appeal are not always the same thing. The very qualities that make modern sensors so capable — the clinical accuracy, the extreme resolution, the smooth noiseless output — are precisely what some photographers find cold and characterless. At 50 megapixels every flaw in a lens is visible, every slight camera shake is punishing, and the sheer volume of information in an image can feel overwhelming rather than immersive. The pursuit of perfection has produced sensors that are almost too good, and that's partly why photographers are looking backwards at CCD and film with renewed interest. Sometimes imperfection is the point.
Conclusion — Can Digital Get the Film Look?
It's the question the entire blog has been building towards, and the honest answer is — almost, but not quite.
The desire to replicate film in digital photography has driven an entire industry. Fujifilm have built their reputation on it, with in-camera film simulations — Provia, Velvia, Acros — that genuinely do a remarkable job of emulating the look and feel of their film counterparts. Other manufacturers allow custom recipes to be installed directly into the camera, altering the JPEG output before the image is even saved. Many cameras now include in-camera grain settings that add a simulated film texture to images, and virtually every piece of post processing software from Lightroom to Capture One offers film emulation tools of varying quality. Some of them are very good indeed.
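To give a flavour of how simple the core idea behind an in-camera grain setting can be, here's a bare-bones sketch using Python with NumPy and Pillow. The filenames are made up, and the method is far cruder than anything Fujifilm, Lightroom or Capture One actually ship.

```python
import numpy as np
from PIL import Image, ImageFilter

def add_grain(path, strength=12.0, seed=0):
    """Crude film-grain emulation: softened monochrome noise,
    weighted towards the midtones where real grain shows most."""
    img = np.asarray(Image.open(path).convert("L"), dtype=float)
    rng = np.random.default_rng(seed)
    # Raw noise looks like static; a slight blur clumps it a little
    noise = rng.normal(0.0, 1.0, img.shape)
    noise_img = Image.fromarray(
        np.clip(noise * 32.0 + 128.0, 0, 255).astype(np.uint8)
    ).filter(ImageFilter.GaussianBlur(radius=0.6))
    noise = (np.asarray(noise_img, dtype=float) - 128.0) / 32.0
    # Grain is least visible at pure black and pure white
    midtones = 1.0 - np.abs(img / 255.0 - 0.5) * 2.0
    out = np.clip(img + noise * strength * (0.4 + 0.6 * midtones), 0, 255)
    return Image.fromarray(out.astype(np.uint8))

# add_grain("frame.jpg").save("frame_grained.jpg")  # hypothetical filenames
```

Real emulation goes much further, varying grain size and strength with ISO and density, but the principle of adding structured noise after the fact is the same.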
The CCD sensor sits in an interesting middle ground in all of this. It's still digital — it still captures light through photosites and converts it to data — but its tonal rendering, highlight roll-off and noise characteristics put it closer to the film end of the spectrum than anything modern CMOS technology produces. That's not nostalgia talking, it's physics. The way a CCD handles light is genuinely different, and for black and white work in particular those differences are meaningful rather than marginal.
But here's the thing. Film doesn't just look different — it is different, fundamentally and chemically, at every stage of its existence. The silver halide crystals that form your image, the developer you choose to process it in, the temperature of that developer, the agitation pattern you use, the paper you print on — every variable in the film process contributes to the final image in ways that no algorithm can fully account for because no two rolls of film, no two development sessions and no two darkroom prints are ever exactly alike. That organic unpredictability is baked into the medium itself.
I still shoot film alongside my digital work, and when I edit and present images on this site I want and need some degree of visual cohesion between them. I work hard to achieve that, and I think I get close. But I can still tell the difference between a film image and a digital one in my own work — and if I can tell, having shot and edited both, then the difference is real. Not always dramatic, not always immediately obvious, but real.
So the verdict — yes, you can get close. With a CCD sensor camera you can get closer than with modern CMOS. With careful editing and the right tools you can close the gap further still. But replicate it completely? No. Film is film, and until someone works out how to bottle chemistry and uncertainty into a digital sensor, it's going to stay that way.
Now — how did you get on with the three images at the top of the blog? Did you guess which was which? Here's the answer...
So, from left to right — Image 1 was taken on the Sony A350, a 2008 crop sensor camera with a CCD sensor. Image 2, the bridge shot, was taken on film — Ilford Delta 400, shot on a Voigtländer Bessa, a camera that predates the Second World War. And Image 3 was taken on the Sony A700, a 2007 crop sensor camera with an early CMOS sensor.
A pre-war film camera, a 2007 CMOS sensor and a 2008 CCD sensor — three very different technologies spanning nearly a century of photography, and the question was whether you could tell them apart. Could you? If it was difficult, that probably tells you everything you need to know about the CCD and film look debate. The differences are real — but whether they're visible to the eye in a finished, edited image is a different question entirely. And that's exactly the point.