Testing virtual 2501 card reader
I fired up the system and tested the 2501 card reader function by toggling in some hand code: an XIO Init Read to start a read, plus an interrupt handler that grabbed the DSW with a reset, paused, then returned to the mainline. The mainline is the XIO read followed by a pause and a loop back, letting me examine the buffer contents after each read.
I used a few card decks from the 1130 simulator as the data to read and check. When I first ran my hand program, I saw it issue the XIO, but there was no sign that either my FPGA or the Python program was processing it. I looked over the FPGA first to be sure it was capturing the XIO IR, then the Python program, and also added some diagnostic LEDs and messages to help. I then discovered that I had forgotten to click the checkbox that activates the virtual 2501 support. Testing again, I found a couple of minor flaws in the Python code, introduced when I collapsed several transactions (last XIO function, WCA address) into one.
I discovered that my Unicode-oriented read of 029 format files (IBM 1130 ASCII files of card images) fails on Ubuntu Linux but works on Windows. I will need to work on this, but initially I will just use native 1130 simulator (binary) format files.
My testing worked great for the file I entered, until I ran into the end of the deck, when the card reader should go 'not ready' unless the 'last card' checkbox is on. My primitive hand code does not check for not ready; it just issues another XIO Init Read even though the file is done. My Python code doesn't watch for this and attempts to read the next image from the closed file, causing an error. I can protect against this in the Python code.
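The guard I have in mind is straightforward: refuse to read past the end of the deck and mark the reader not ready instead. A minimal sketch, with illustrative names (CardReader, read_next_card) that are not the project's actual code:

```python
# Hedged sketch: protecting the virtual 2501's Python side against a read
# past the end of the deck. Names are illustrative, not the real code.

class CardReader:
    def __init__(self, images):
        self.images = list(images)   # card images loaded from the deck file
        self.pos = 0
        self.not_ready = False       # mirrors the DSW Not Ready bit

    def read_next_card(self):
        """Return the next card image, or None after going not ready."""
        if self.pos >= len(self.images):
            self.not_ready = True    # deck exhausted: refuse further reads
            return None
        card = self.images[self.pos]
        self.pos += 1
        return card

reader = CardReader(["CARD1", "CARD2"])
assert reader.read_next_card() == "CARD1"
assert reader.read_next_card() == "CARD2"
assert reader.read_next_card() is None and reader.not_ready
```

The same check could live in the FPGA instead, which is the route taken later in this post.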
A real reader acts the same way when the hopper empties. It signals not ready; then, if the operator pushes the start button with no new cards in the hopper, it goes ready in last card mode. The next XIO Init Read will read the final card, and the operation complete interrupt will have the 'last card' bit set in the DSW, telling the software that this is end of file rather than a temporary pause while more cards are loaded into the hopper.
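That hopper behavior can be sketched as a small state machine. This is a minimal illustration only: the class name and the DSW bit positions below are assumptions for the sketch, not the real 2501 bit layout.

```python
# Hedged sketch of the last-card sequence described above. Bit positions
# are placeholders; a real 2501 DSW uses different bit assignments.
DSW_NOT_READY = 0x0001   # assumed bit value, for illustration only
DSW_LAST_CARD = 0x0002   # assumed bit value, for illustration only

class HopperModel:
    def __init__(self, cards):
        self.cards = list(cards)
        self.last_card_mode = False
        self.dsw = 0

    def feed(self):
        """One XIO Init Read: return a card image, or None if not ready."""
        if len(self.cards) == 1 and not self.last_card_mode:
            # Final card stays in the reader until the operator confirms EOF.
            self.dsw = DSW_NOT_READY
            return None
        card = self.cards.pop(0)
        # Reading the true final card reports 'last card' = end of file.
        self.dsw = DSW_LAST_CARD if not self.cards else 0
        return card

    def press_start_empty_hopper(self):
        """Operator pushes START with no new cards: enter last card mode."""
        self.last_card_mode = True
        self.dsw = 0

h = HopperModel(["A", "B"])
assert h.feed() == "A"
assert h.feed() is None and h.dsw == DSW_NOT_READY
h.press_start_empty_hopper()
assert h.feed() == "B" and h.dsw == DSW_LAST_CARD
```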
I will fix up the Python code then step through the card images in my hand code on the 1130, seeing that it behaves properly for both the last card sequence and the 'out of cards' not ready case. If this is fixed, the only issue is converting 029 format unicode files to the native binary format as I open such files.
I still had a flaw in handling hand code that incorrectly issues an XIO Init Read to the virtual 2501 when it is not ready. The solution is in the FPGA: it won't record the last XIO function code if either Not Ready or Busy is on in the DSW. The Python code won't see any XIO in that case until it loads a new file and sets the DSW back to ready (turning off the Not Ready bit).
The code I wrote to handle the Unicode ASCII files (needed for two special characters on card images that are not part of the 8-bit ASCII character set) used the Windows generic encoding 'mbcs', but Unix and Linux have no such default codec. I changed the code to explicitly request 'UTF-8'. Testing showed this worked perfectly.
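The fix is simply naming the encoding when opening the file. A minimal sketch (the file name and sample text are illustrative): Python's 'mbcs' codec exists only on Windows, while 'utf-8' is available everywhere.

```python
# Hedged sketch: opening a card-image text file portably. Relying on the
# Windows-only 'mbcs' codec raises LookupError on Linux; requesting
# 'utf-8' explicitly works on both platforms.
import os
import tempfile

text = "DECK\u00a2CARD"   # includes a character outside 8-bit ASCII

path = os.path.join(tempfile.mkdtemp(), "deck.029")
with open(path, "w", encoding="utf-8") as f:
    f.write(text)

with open(path, "r", encoding="utf-8") as f:
    assert f.read() == text   # round-trips cleanly on Windows and Linux
```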
I ran multiple decks through the virtual card reader, including a mix of binary and Unicode ASCII, with everything working as expected. The last card image of each file is left 'in the reader' with the reader not ready, unless the 'last card' checkbox is selected, which allows the final card to be read as well. The DSW showed last card or not ready exactly as it would with a real 2501.
At a later time, I will test the boot operation on the virtual 2501. This will read one card image in the special program load mode, store the 80 columns in memory starting at location 0, then trigger my pseudo-program-load sequence.
Building virtual 1442 functionality
I jumped into modifying and building the logic for the 1442 reader/punch, since a few aspects of that design caught my attention. The PC side is pretty clean, while the FPGA side is a bit more complex, mainly because of the looping FSMs emitting column-by-column interrupts and the timing modeling for feeding and other operations.
The PC will watch for XIO instructions and respond to XIO Control and XIO Init Read (a fake operation that the programmer should never execute for a 1442). Any XIO Control will cause a feed cycle, which pushes down one card image from the PC file. I decided that users must open a file of 'blank cards' if they are punching; otherwise the file contains data cards to read.
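The feed-cycle dispatch can be sketched compactly. The names and the XIO function code value below are placeholders for the sketch, not the project's actual code:

```python
# Hedged sketch of the PC-side 1442 feed dispatch described above.
# XIO_CONTROL's value and the function names are illustrative only.
XIO_CONTROL = 0b100       # assumed function code value
BLANK_CARD = [0] * 80     # 80 blank columns, for punch-only runs

def feed_cycle(deck):
    """Push down the next card image from the open PC file.

    A punch-only run opens a file of 'blank cards'; a read run opens a
    file of data cards. Either way the feed path is identical."""
    if deck:
        return deck.pop(0)
    return None           # file exhausted: the device would go not ready

def handle_xio(func, deck):
    """Any XIO Control triggers a feed cycle; other codes go to the FPGA."""
    if func == XIO_CONTROL:
        return feed_cycle(deck)
    return None

deck = [[1] * 80, BLANK_CARD]
assert handle_xio(XIO_CONTROL, deck) == [1] * 80   # data card fed
assert handle_xio(XIO_CONTROL, deck) == [0] * 80   # blank card fed
assert handle_xio(XIO_CONTROL, deck) is None       # deck exhausted
```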
The FPGA will see the card pushed down, start emitting column interrupts at the proper pace, handle XIO Read or XIO Write within the FPGA by feeding or updating the two card buffers, and claim to have seen an XIO Init Read when the last column of the read or punch is processed and the elapsed time matches the performance of a 1442 model 7.
The FPGA will hold off on the operation complete interrupt (IL4) until it sees that the PC has fetched the last of the punch buffer; it then triggers the interrupt and clears it after the XIO Sense DSW with the Reset bit is processed.
The PC program sees the XIO Init Read code and fetches the punch buffer contents. If an output file is open on the PC, it writes that image to the end of the file. In either case, it goes back to polling for the next XIO Control operation.
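That PC-side loop can be summarized in a few lines. This is a sketch only: poll_for_xio and fetch_punch_buffer stand in for the project's actual FPGA-link calls, which are not shown in the post.

```python
# Hedged sketch of the PC-side punch loop described above. The callables
# are stand-ins for the real FPGA-link interface.

def punch_loop(poll_for_xio, fetch_punch_buffer, out_file=None):
    """Poll for XIO codes; on the fake Init Read, save the punch buffer."""
    while True:
        code = poll_for_xio()
        if code is None:             # link closed / shutdown requested
            break
        if code == "INIT_READ":      # FPGA's signal that punching finished
            image = fetch_punch_buffer()
            if out_file is not None:
                out_file.append(image)   # write image to end of output file
        # in either case, fall through and poll for the next XIO Control

# Exercise the loop with fake link calls and a list as the output file.
codes = iter(["CONTROL", "INIT_READ", None])
out = []
punch_loop(lambda: next(codes), lambda: "IMG", out)
assert out == ["IMG"]
```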