I have finally managed to get a dot matrix display running full bore without resorting to outrageous voltages, such as 18V on 5V logic. I have been working toward this moment for probably two or three years now.
This uses a pinned version of the hardware described in previous posts and more or less proves that the design will work with only minor modifications. The one major difference is that I am not using the same chip for the LED current sink. In the picture on the right, it is running from an input of 6.37V @ 325mA, which is regulated down to 5.26V and 4.78V.
One major deviation from the hardware mentioned previously is that the row decoder runs at a slightly higher voltage (5.26V) than the rest of the circuit. This is to remedy a problem I noticed with the MOSFETs: their gates are not fully turned on unless the gate voltage is above the drain voltage. What I ended up doing was running a 5.1V zener off of the input from my power supply to create a higher voltage than what the 7805 supplies to the rest of the circuit (including the MOSFETs). This is marvelously inefficient and seems to be causing the 7805 to heat up, though that might also be from the problem I will go over next.
The other major deviation is how I am running the column sink driver. I ordered one that I thought was almost exactly the same as the one I planned to use on the PCB, but it doesn't behave like its datasheet says it should. I ended up having to remove the external current-set resistor and replace it with a direct short to ground. This effectively removed the current limit, and now the chip draws 120mA by itself (which is bad). In contrast, the chip draws only 20mA with a 300Ω resistor, but then half of the rows get turned off. This in itself makes no sense, since this is the column sink; for something like that to happen, it would have to involve the row drivers (i.e. the MOSFETs and their accompanying logic), yet it consistently shows up when I put a resistor on the resistor pin of the column sink chip. However, if a row starts flickering irregularly, all I have to do is increase the input voltage a bit and it goes away. Any ideas as to why this might happen are appreciated (leave a comment).
From a coding standpoint, I think I grossly overestimated how hard it would be to strobe this display. The chip is running at 12MIPS and so far is able to render the entire 40x16 buffer at 30fps with a ton of idle time. I have not yet implemented the grayscale dimming functionality, but even that won't add much overhead. Looking at my scope, I can see that out of the 584µs between rows, only ~290µs is actually used for writing to the column sink and switching which row is turned on. This puts my "processor usage" at slightly less than 50%. I was expecting it to be significantly higher, so implementing the 8-bit parallel bus that will link this board to the "computer" board should be easier than I'd feared.
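For the curious, the strobe itself boils down to very little code. Here's a minimal sketch of the idea (not my actual code; the helpers shift_out_columns(), select_row(), and rows_off() are made-up stand-ins for the column sink and row decoder interfaces, and I'm assuming the 40x16 buffer is packed 5 bytes per row):

```c
#define NUM_ROWS      16
#define BYTES_PER_ROW 5   /* 40 columns / 8 bits per byte */

static unsigned char framebuffer[NUM_ROWS][BYTES_PER_ROW];
static unsigned char current_row = 0;

/* Stand-ins for the real hardware interfaces */
void shift_out_columns(const unsigned char *data, unsigned char len);
void select_row(unsigned char row);   /* sets the row decoder inputs */
void rows_off(void);                  /* turns off all of the row MOSFETs */

/* Called once per row period (~584us here), e.g. from a timer interrupt */
void strobe_next_row(void)
{
    rows_off();   /* blank first so the previous row doesn't ghost */
    shift_out_columns(framebuffer[current_row], BYTES_PER_ROW);
    select_row(current_row);   /* turn the new row on */
    current_row = (current_row + 1) % NUM_ROWS;
}
```

Everything outside that routine (the remaining ~290µs of each row period) is idle time the parallel bus handler will be able to use.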
The MCC18 libraries suck. While waiting for my clock parts to come in, I have been trying to get my WirelessUSB transmitter to communicate properly through the USART. I discovered that it was having trouble syncing with the start and stop bits and was giving me a bunch of framing errors when receiving, though it had no problem transmitting. I was absolutely puzzled as to why this was happening and asked around on a few forums to see if anyone could shed some light on the subject. Someone on one forum (I can't remember which) suggested I try controlling the USART "manually" without using the MCC18 libraries. To my great astonishment, it started echoing back and reading my characters properly without throwing any errors. Obviously, the MCC18 libraries don't work right. I have had similar problems with the MSSP libraries, but at the time I was unable to control that peripheral manually because of my lack of experience with it (I could do it now). Basically, my advice is now: DON'T USE THE MCC18 LIBRARIES.
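For anyone wondering what "manual" control looks like, here is roughly what I mean on a PIC18 (a sketch, not my exact code; the SPBRG value assumes a 4MHz oscillator at 9600 baud, and the TRISC pin assignments depend on which part you have):

```c
#include <p18cxxx.h>   /* generic C18 device header */

void usart_init(void)
{
    TRISCbits.TRISC6 = 0;  /* TX pin as output */
    TRISCbits.TRISC7 = 1;  /* RX pin as input */
    SPBRG = 25;            /* 9600 baud @ 4MHz with BRGH=1 -- adjust for your clock */
    TXSTA = 0x24;          /* TXEN=1, BRGH=1: async transmit, high-speed baud */
    RCSTA = 0x90;          /* SPEN=1, CREN=1: serial port and receiver enabled */
}

void usart_putc(unsigned char c)
{
    while (!PIR1bits.TXIF);  /* wait for TXREG to empty */
    TXREG = c;
}

unsigned char usart_getc(void)
{
    while (!PIR1bits.RCIF);  /* wait for a byte to arrive */
    if (RCSTAbits.OERR)      /* overrun: cycle CREN to recover the receiver */
    {
        RCSTAbits.CREN = 0;
        RCSTAbits.CREN = 1;
    }
    /* RCSTAbits.FERR is set here if this byte had a framing error */
    return RCREG;            /* reading RCREG also clears RCIF */
}
```

A simple echo loop, while(1) usart_putc(usart_getc());, is enough to test it, and that was exactly the test that worked with the registers driven directly but threw framing errors through the library calls.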