I'm currently speedrunning analog electronics for the Late Mate project, and one
thing I had to figure out is how much current the photodiode I chose can generate. It turned out to be a small rabbit hole
inside the larger rabbit hole.
When used in a sensor, the photodiode acts like a tiny solar cell, generating current. This current
is picked up by a transimpedance amplifier
and transformed into a voltage our microcontroller can read with an ADC. The more light hits the sensor,
the more voltage reaches the microcontroller, and the higher the number the firmware reads.
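The whole design hinges on that current: you pick the feedback resistor so that the biggest expected photocurrent lands near the top of the ADC range. Here's a rough sketch of that in Python; every value below is a placeholder I made up, not an actual part choice:

```python
# Rough transimpedance amplifier sanity check.
# All values are placeholders, not final part choices.

photocurrent_a = 20e-6       # guessed photodiode current, 20 µA
r_feedback_ohm = 100_000     # feedback resistor: V_out = I * R_f
vref_v = 3.3                 # ADC reference voltage
adc_bits = 12                # a typical microcontroller ADC

v_out = photocurrent_a * r_feedback_ohm
counts = int(v_out / vref_v * (2**adc_bits - 1))

print(f"TIA output: {v_out:.2f} V -> ADC reads ~{counts} counts")
# 20 µA * 100 kΩ = 2.0 V, about 60% of a 3.3 V full scale
```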
To design the amplifier, I need to calculate the maximum current it will receive from the photodiode. The current
depends on the photodiode and the amount of light hitting it. How bright is a computer monitor, really?
Should be easy to find! It's right there!
Except computer monitors are made for humans. This means their brightness is always quoted in "nits" or
"candelas per square metre". Which are units of "illuminance". What is "illuminance", I asked myself.
Next thing I know I stumbled upon a very helpful presentation
explaining that illuminance is "how bright something looks to humans", while "radiance" is "how much radiated energy
hits the surface". A bright ultraviolet source can be very radiant, but not at all luminant!
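As far as I understand it, the two are tied together by the eye's sensitivity curve: luminance is spectral radiance weighted by the photopic luminosity function V(λ) and scaled by 683 lm/W, something like

$$L_v = 683\,\tfrac{\mathrm{lm}}{\mathrm{W}} \int V(\lambda)\, L_{e,\lambda}(\lambda)\, \mathrm{d}\lambda$$

so going backwards from luminance to radiance only works if you know (or assume) the spectrum.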
This means converting between the two is… hard? I need the energy side, but I would need to know what wavelengths
computer monitors emit to calculate the energy from visible brightness. Thankfully, a bunch of smart people
have already mapped sunlight radiance to luminance.
Screens don't really match the sunlight spectrum, but it's close enough for a rough guess.
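Here's the kind of guesstimate I mean, sketched in Python. Every number is an assumption: a sunlight-like luminous efficacy of roughly 100 lm/W, a sensor pressed right against a (Lambertian) panel so the screen fills its whole field of view, and placeholder values for monitor brightness, sensor area, and photodiode responsivity:

```python
import math

# Everything below is an assumption, just to get an order of magnitude.
luminance_nits = 400.0               # a fairly bright monitor, cd/m^2
luminous_efficacy_lm_per_w = 100.0   # sunlight-ish spectrum, roughly 93-115 lm/W
sensor_area_m2 = 7e-6                # ~7 mm^2 photodiode active area
responsivity_a_per_w = 0.4           # silicon photodiode in the visible range, A/W

# Luminance (cd/m^2 = lm/(sr*m^2)) -> radiance (W/(sr*m^2))
radiance = luminance_nits / luminous_efficacy_lm_per_w

# For a Lambertian panel filling the sensor's entire field of view,
# the irradiance on the sensor is pi * radiance.
irradiance_w_per_m2 = math.pi * radiance

optical_power_w = irradiance_w_per_m2 * sensor_area_m2
photocurrent_a = optical_power_w * responsivity_a_per_w

print(f"~{irradiance_w_per_m2:.1f} W/m^2 on the sensor, "
      f"~{photocurrent_a * 1e6:.0f} µA of photocurrent")
# roughly 12.6 W/m^2 and ~35 µA with these made-up numbers
```

With these made-up numbers the photocurrent lands in the tens of microamps, which is exactly the kind of figure the amplifier design needs.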
I hear you asking: Dan, couldn't you just measure? Yep, I could just measure. I will measure. But first we'll
order our first batch of PCBs, and those will use the guesstimated number.
Next speedrun: KiCad.