#+TITLE: Infrared head-tracker
#+AUTHOR: Dylan Holmes

* Demonstration

(Watch especially the corners of the screen, where new parts of the scene come into view as you approach.)

With this project, I convert a computer monitor into a window into a virtual world: a user who looks at the monitor from different angles, or who approaches the monitor, can see more or less of the 3D scene --- exactly as with a real window.

Here's how the monitor looks /without/ the infrared tracking (it's just a monitor):

And here's the same scene with infrared tracking. Note the immersive effect as you approach:

At a high level, the project consists of two devices: an infrared camera rig (a webcam-sized component which is fixtured to the monitor and plugged into the computer over USB) and an infrared emitter device. The infrared emitter device is worn by a single user; by detecting the positions of its two infrared dots, the camera can determine where, and how far away, the user is. (The camera rig does all of the computation; the emitter simply has lights.)
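To make that last point concrete, here is a minimal sketch (not the project's actual code) of how the two dot positions reported by the camera can be turned into an estimate of where the user's head is. It assumes the camera reports dot coordinates on roughly a 1024x768 grid (its x and y values are 10-bit); the LED separation, field of view, and all names below are illustrative assumptions rather than measured values.

#+BEGIN_SRC javascript
// Estimate head position from the two infrared dots seen by the camera.
// Illustrative sketch only: LED_SEPARATION_MM and FOV_DEGREES are assumptions,
// not values measured from this project.
const CAMERA_W = 1024, CAMERA_H = 768;   // camera's reported coordinate range
const FOV_DEGREES = 33;                  // assumed horizontal field of view
const LED_SEPARATION_MM = 150;           // assumed physical distance between the two LEDs

function headPosition(dotA, dotB) {
  // Angle subtended by one camera pixel (small-angle approximation).
  const radiansPerPixel = (FOV_DEGREES * Math.PI / 180) / CAMERA_W;

  // The farther away the user stands, the closer together the two dots appear.
  const dx = dotA.x - dotB.x, dy = dotA.y - dotB.y;
  const separationPx = Math.sqrt(dx * dx + dy * dy);
  const distanceMm = LED_SEPARATION_MM / Math.tan(separationPx * radiansPerPixel);

  // The midpoint of the two dots gives the head's direction relative to the camera.
  const midX = (dotA.x + dotB.x) / 2 - CAMERA_W / 2;
  const midY = (dotA.y + dotB.y) / 2 - CAMERA_H / 2;
  return {
    x: distanceMm * Math.tan(midX * radiansPerPixel),
    y: distanceMm * Math.tan(midY * radiansPerPixel),
    z: distanceMm,
  };
}

// e.g. headPosition({x: 400, y: 380}, {x: 620, y: 384})
#+END_SRC

The key fact this relies on is that the pixel separation between the two dots shrinks as the user backs away, which is what makes a single camera sufficient for a distance estimate.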
* What else exists

*Wiimote-based headtracking* was the initial inspiration for this approach, and in fact it works on the same principle. The Nintendo Wii gaming console ships with a Wii remote (which is equipped with an accelerometer and an infrared camera, and which communicates over bluetooth) and a "sensor bar" (which actually just contains infrared LEDs). Since Johnny Lee's initial demo, a community of other hackers has put together Wii-based hacks of various kinds.

Inspiration: https://youtu.be/Jd3-eiid-Uw?t=149

TED talk: https://youtu.be/QgKCrGvShZs?t=19

An advantage of using Nintendo products is that they are easy to obtain and, thanks to economies of scale, can cost less than their individual parts. A primary defect, which I've circumvented with my project, is that the parts are pre-assembled and blackboxed. We shouldn't have to buy the equipment from Nintendo itself; I also suspect that nowadays these Nintendo products are no longer as inexpensive as their constituent parts.

*Accelerometer-based headtracking* is another popular approach.

*Virtual reality headsets* are sort of the opposite of what I've done: these place you into a virtual environment, whereas I place a virtual window into the real world. It's closer to a kind of *augmented reality*, though without special glasses. (You just wear a pair of LEDs; the form factor could be made, I imagine, even slimmer than I've gotten it.)

* What I built

- I designed a *camera rig*, consisting of a board for communicating over serial (with a computer) and over I2C (with a camera). The camera rig is enclosed in a 3D-printed case and sealed with a decorative laser-cut lid.
  [[./img/enclosure-front.jpg]]
  [[./img/enclosure-open.jpg]]
  [[./img/enclosure-inner.jpg]]
  # Draft versions:
  # [[./img/plastic-enclosure.jpg]]
  # [[./img/plastic-case-mk1.jpg]]
- I designed several neat modular boards, including a *regulated 5V power supply* (battery case affixed to a 5V regulator) and *infrared LED daughter boards* with through-hole LEDs affixed using rivets. They have various neat features, such as all sharing asymmetric (never reversible) four-pin headers for Vcc and ground, having LED status lights, and being arrangeable in arbitrary topologies.
  [[./img/power-supply-led-daughter.jpg]]
- I assembled the modular boards into an *emitter rig*, which consists of two daughter boards and the regulated 5V power supply. They were enclosed in a *laser-cut enclosure* and affixed to a headband with *3D printed* joints. The author demonstrates the assembled device:
  [[./img/author.png]]
  Earlier incarnations:
  [[./img/emitter-assembled.jpg]]
- The *3D printed joints* took me on a detour into the design of effective pressure-fit clips, similar to work on joinery in wood. The clips are designed to be pretty structurally robust.
  [[./img/snap-fit.jpg]]

I used only free (libre) software to do all of this:

- I programmed in C, using emacs as a development environment.
- I used KiCad to design circuits.
- I used OpenSCAD to design all 3D printed objects.
- I used Inkscape to draw all lasercut objects.

[[http://logical.ai/make/F%20project/progress-notes.html][final hours progress log]]
[[./project-2.html][long-term log]]

* Materials

The most expensive item was the infrared camera, a specialty device costing tens of dollars ($24). Everything else, except the acrylic, cost fractions of a dollar, and everything except the bright LEDs and the headband came from the fab inventory.

I bought rather than made the infrared camera because it is a special-purpose sensor (high-resolution infrared imaging) essential for my particular approach. I bought rather than made a headband because it seemed like the most effective way to get a flexible, adjustable, tension-based substrate for head-mounted electronics (and one with no moving parts). Many alternatives were considered and tried, but this one emerged as the simplest.

Quantities tabulated below represent the amounts used in the final iteration, with all due respect to the many iterations that preceded it.

| Item                       | # used | Source                                                                                                              | Cost              |
|----------------------------+--------+---------------------------------------------------------------------------------------------------------------------+-------------------|
| Atmega328p microcontroller | 1      | Digikey                                                                                                             |                   |
| Superbright infrared LEDs  | 2      | [[https://www.adafruit.com/product/388][Adafruit]]                                                                  | $0.32 ($8 for 25) |
| Green LEDs                 | 3      | Digikey                                                                                                             |                   |
| Infrared camera            | 1      | [[https://www.digikey.com/catalog/en/partgroup/ir-positioning-camera-for-arduino/71049][Digikey]]                   | $24               |
| Acrylic                    |        | [[http://www.jfreeman.com/][JFreeman Zyro]]                                                                         | ($104 for panes)  |
| PLA (3D printing material) |        | [[https://3dprinter.sindoh.com/en/supplies/cartridge/11][Sindoh]]                                                   | ($30 / 700g)      |
| 1mm rivets                 | 4      |                                                                                                                     |                   |
| SMD resistor --- 1000Ω     | ~4     |                                                                                                                     |                   |
| SMD resistor --- 100Ω      | ~2     |                                                                                                                     |                   |
| SMD 2x02 connectors        | ~7     |                                                                                                                     |                   |
| SMD 2x03 connector         | 1      |                                                                                                                     |                   |
| Ribbon cable               |        |                                                                                                                     |                   |
| 5V, 1A regulator           | 1      |                                                                                                                     |                   |
| 9V battery                 |        |                                                                                                                     |                   |
| 9V battery case            |        |                                                                                                                     |                   |
| Headband                   |        |                                                                                                                     |                   |

(I'm not including things like endmills and solder stations, as these facilitate, but are not exactly included in, the final product.)

* Questions answered

1. Can you reasonably track the position of a person's head in 3D using only two infrared LEDs and a small infrared camera? Yes, surprisingly well: very fast framerate, convincingly accurate position reconstruction, and surprisingly large distances from the camera.
2. How do you power a wearable device? In the clunky case, you can power it using a 9V battery with a voltage regulator. Button-cell batteries (e.g. watch batteries) are another option, but there are power-supply requirements to consider.
3. What makes an effective snap-fit joint? Features like fillets and tapered legs are well documented. I also discovered the importance of using PLA for 3D printing, as PLA is more flexible than most other alternatives --- and some flexibility is key.
4. How much power do you need? How bright do the LEDs have to be? The camera can see you from pretty far away, but the lights must be bright. Fab lab LEDs are serviceable, but in my experience, within their allowable current range they can only be seen at short distances. The Adafruit LEDs can tolerate 100mA of steady current. (A rough sizing sketch follows this list.)
5. How do I use this camera? The camera is basically undocumented --- the documentation covers the operating voltages, the fact that it communicates over I2C, and some contradictory or underspecified incantations for dealing with it. The solution was a kind of treasure hunt, and a lesson in dealing carefully with undocumented devices.
6. For debugging purposes, how do you tell whether infrared LEDs are working? If they're fab lab LEDs, you can use most smartphone cameras. You can also test the current, or use one of the webcam scripts.
7. How do you, generally speaking, affix things to other things?
8. How do you include through-hole resistors in a PCB design?
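As a rough guide to those power questions, here is a back-of-the-envelope sketch. The forward voltage, battery capacity, and resulting resistor value are assumptions for illustration; the project's own boards use the resistor values listed in the materials table, which may differ.

#+BEGIN_SRC javascript
// Back-of-the-envelope LED drive and battery-life arithmetic.
// All constants are illustrative assumptions, not measurements from this project.
const SUPPLY_V = 5.0;          // regulated supply voltage
const LED_FORWARD_V = 1.5;     // assumed forward voltage of a superbright IR LED
const TARGET_CURRENT_A = 0.1;  // 100 mA steady current (the Adafruit rating)

// Series resistor that would set roughly that current through one LED:
const resistorOhms = (SUPPLY_V - LED_FORWARD_V) / TARGET_CURRENT_A;   // 35 Ω

// Rough runtime from a 9V battery (assumed ~0.5 Ah) through a linear regulator,
// driving two LEDs; the regulator's quiescent current is ignored.
const BATTERY_AH = 0.5;
const runtimeHours = BATTERY_AH / (2 * TARGET_CURRENT_A);              // ≈ 2.5 h

console.log({ resistorOhms, runtimeHours });
#+END_SRC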
* Evaluation

I evaluate the success of my project's operation based primarily on the fact that it (a) has a working information pipeline from camera to computer, and from nodejs server to HTML client; (b) locates the lights and reconstructs 3D position surprisingly robustly given the small amount of data; (c) works at close and far range, thanks to the especially bright LEDs; and (d) creates an effect that's engaging and occasionally pretty realistic.

I evaluate the success of my workflow by the fact that I was able to implement spiral-based design for the first time: I had a design that worked badly, with dim LEDs, and iterated from there. Every time I got sidetracked or blocked by something overly ambitious, I was able to readjust to an incremental, minimal working project.

I evaluate the success of what I learned from my project based on an innumerable (though still pretty carefully logged) collection of mistakes and/or life-changing new skills. I also count some extra learning objectives that I had hoped to meet, including learning how to use solder paste on an ATMega, learning register manipulation for TWI/I2C communication, and learning the principles of snap-fit joints.

I evaluate the success of my project's side effects by all the useful modular scripts, board templates, and 3D designs that I produced: there's a small zoo of modular interconnectable boards, parametric 3D designs for snap-fit joints and chamfers, and scripts for communicating over serial, adding extra width to traces, converting videos, etc.

* Implications

From a platform perspective, the implications are pretty exciting:

- *Immersive environments*. This project's demo shows that decently real "window-like" experiences are possible. Future improvements include adding more LEDs so as to more completely reconstruct the physical position/orientation of the user, using multiple cameras for stereo, and moving from rendered scenes to, perhaps, telepresence, viewpoint-dependent UI for CAD software, etc.
- *The infrared approach*. This proof-of-concept demo shows that infrared emission and detection is entirely possible at impressive distances, without expensive equipment. Using the same principle, you could design a digital whiteboard (where an infrared camera watches the movements of an LED-equipped dry-erase marker to store the writing digitally), or a cheap multi-finger tracker (watching infrared light reflected off of fingers). In fact, the board design and firmware remain the same in any case; what changes is the configuration of physical parts and the high-level software that interprets the infrared lights.
- *Nodejs as a web-enabled scripting language*. The development of this project resulted in at least one useful script, =serial.js=, which replicates the behavior of programs like =term.py= or =picocom=; such scripts can be useful for people who have a prior reason to like or use Nodejs --- including wanting to make threejs applications.

* What's next

Using this same approach, you could:

1. Try stereo vision or multiple LEDs.
2. Try a strobing approach. The Adafruit LEDs are designed to take 100mA normally and up to 1A (!!) of current for short periods of time; maybe flashing the lights with a heavy current source would produce better detection at longer distances. (A rough duty-cycle sketch follows this list.)
3. Communicate with the computer over bluetooth instead of over an FTDI cable.
4. Include an accelerometer.
5. Streamline the headband.
6. Interface with the computer as a general-purpose USB device so that head position can, e.g., behave like a mouse or a joystick.
7. Write a plugin for CAD software to peer around a 3D shape as you design it.
8. Instead of a rendered 3D scene, use pre-recorded wide-angle videos so that you can feel like you're watching a park scene, for example.
9. Instead of a rendered 3D scene, use a moving live camera (or just several static cameras at different angles) to display different views of a real scene to a user. (Telepresence.)
10. Make a digital whiteboard --- the camera watches where you draw with a pen that has an LED attached. (Same infrared principle.)
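On the strobing idea, here is an illustrative duty-cycle calculation. The numbers are only the two ratings quoted above; real pulse-current limits and timing would have to come from the LED datasheet.

#+BEGIN_SRC javascript
// If an LED tolerates 100 mA continuously and 1 A in short pulses, what duty
// cycle keeps the *average* current at the continuous rating? Illustrative only:
// a real datasheet also caps the pulse width and the pulse period.
const CONTINUOUS_A = 0.1;
const PULSE_A = 1.0;

const maxDutyCycle = CONTINUOUS_A / PULSE_A;        // 0.1, i.e. 10% on-time
const strobePeriodMs = 10;                          // assumed strobe period
const maxPulseMs = strobePeriodMs * maxDutyCycle;   // ≈ 1 ms on, 9 ms off

console.log({ maxDutyCycle, maxPulseMs });
#+END_SRC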
* Recreate this project yourself.

You can recreate this project yourself using the following [[http://logical.ai/make][files]]; this is everything you need. (Consult the materials list for the parts you'll need.) It uses only free (libre) software and pure C code --- no Arduino libraries required.

1. Build the camera rig.
   1. Build the *I2C-ready board*. [fn::Honestly, I used an ATMega 328P (as used in Arduinos) because I wanted to learn how to use and solder one in this class, and because it has built-in register support for hardware I2C --- you could more economically, and just as easily, do this project by bitbanging I2C on an ATTiny 44, which I did in an earlier iteration.]
   2. Program the *I2C-ready board* with *software* for communicating over serial (with the computer) and I2C (with the camera). [fn::It uses an external I2C library, which is pretty unnecessary given that its routines simply set clusters of registers --- given more time, I would have written my own subroutines following Neil's naming conventions and function signatures from his bitbanging I2C demo.] Do note the position of the ground pin; the FTDI cable needs to be plugged in "upside down".
   3. You can, for example, test Neil's blinking-lights program for the 328p. (Note: because my board has an oscillator, you'll need to set the fuses for the mega328p, which is slightly different from setting them for the attiny44.) You can test serial output, too.
   4. Source the infrared camera and plug it in. You can test that you get a response (and not an infinite delay) from the camera when you send an I2C request to its address. (Visualization comes later.)
   5. 3D print the *camera enclosure*. Material can vary, given that it serves simply to house the electronics.
   6. Affix the board to its special place in the enclosure; the 6-pin header should be visible from the outside. I used a small amount of double-sided tape to keep it in place; the perfect fit of the enclosure does the rest.
   7. Laser cut a *nice cover* for the *camera enclosure*.
   8. Plug in the camera. (The body of the camera carries a key telling you which wire is which: red: Vcc, black: GND, yellow: SDA, green: SCL.) Coil up the camera wire so that it fits nicely within the enclosure.
   9. Completely remove the top hexagonal screw from the camera. Push the camera through the hole in the *nice cover*. The top of the camera is labeled TOP; twist the camera until TOP is on top. Then replace the hexagonal screw, tightening it so that the two screws sandwich the camera body to the nice cover from either side.
   10. Fixture the nice cover to the camera enclosure, forming the *camera rig*. Plug in an FTDI cable. Note the location of the ground pin; the cable goes in "upside down".
2. Build the emitter rig.
   1. Mill and stuff the *9V-to-5V battery converter board*.
   2. Acquire a 9V battery and 9V battery case; leave the 9V battery out for now. Use fixture tape (double-sided tape) to affix the converter board to the back of the case. For convenience, the unregulated voltage lines should be on the side of the battery case that has the red and black wires.
   3. Solder the red and black wires on the battery case to the unregulated voltage and ground traces on the *9V-to-5V battery converter board*. Hint --- solder paste can be used like an extra hand: put solder paste on the traces, set the wire into the paste, hold it in place with tweezers, then heat it with the soldering iron in your other hand. Hint --- be careful not to bridge Vcc and ground in this process. Hint --- solder is not really structurally trustworthy. You have now built a *regulated 5V power supply*. Add the battery and flip the board's switch ("up" means on, as with a light switch) to test whether the board works; if it does, the LED should light up.
   4. Mill and stuff two (2) *through-hole LED daughter boards*. (This is a sort-of two-sided board, which requires either an Othermill or the alignment-hole trick on the Roland. It also requires the use of "1mm rivets" in the holes.)
   5. Laser cut an *acrylic enclosure* for the LED daughter boards.
   6. 3D print two (2) *joints*. The material must be somewhat flexible, so e.g. PLA is preferred over other, more rigid plastics.
   7. Affix the LED daughter boards to the acrylic enclosure, creating an *LED bar*. (I used clear tape, but there are better approaches.) Take care --- the LED daughter boards have four-pin headers that must remain exposed in order to be powered. Also take care --- the LED daughter boards have a specific orientation. The "upper left" pin on the four-pin connector is always Vcc and the "lower left" pin is always ground; the other two pins are disconnected. If you orient the daughter boards all the same way, the ribbon cables connect nicely without any twists.
   8. Author's note --- you can test the *LED daughter boards* by connecting them to power, e.g. the *regulated 5V power supply*. Everything interconnects via four-pin connectors on ribbon cable, which you can make yourself. If an LED daughter board is working properly, its green LED will light up. The infrared LEDs must be tested by other means, e.g. checking whether current is passing through, or inspecting with an infrared camera or with a smartphone. (Many smartphones, though not all, can see partly into the infrared, which appears as purple light.
      In my experience, the infrared LEDs in the fab inventory are visible to phones --- but the superbright Adafruit LEDs I used are not; they emit in a different frequency range.)
   9. To construct a *tailor-made ribbon cable*, first cut a four-strand ribbon cable to approximately the width of the separation between the LED daughter boards on the *LED bar* (slightly longer is okay). Attach a 4-pin connector to one end of the ribbon cable, but not the other.
   10. Plug the ribbon cable into one of the daughter boards and thread it through the rectangular holes so that it passes behind the LED bar to reach the second LED daughter board. Thread the ribbon back up to reach the second LED daughter board.
   11. Slide another 4-pin connector onto the ribbon cable where it will give power to the second LED daughter board. Hint --- for convenience, the 4-pin connector should face the same direction as the other 4-pin connector. Clamp the 4-pin connector onto the ribbon cable once you know where it must be positioned, and trim off any excess (for aesthetics). This forms the *tailor-made ribbon cable* which is threaded through the *LED bar*. It can't be unplugged without cutting off the 4-pin connectors; this is an aesthetic design choice for the finished product.
   12. Using the two joints, affix the *LED bar* to a headband you have sourced. The joints should be at the middle of the headband, centered "on the forehead".
   13. Make another small four-strand ribbon cable with connectors on both ends. Plug one end into the *regulated 5V power supply* and the other end into the *LED bar*. Now you have an *emitter rig*.
3. Run the software.
   1. Install node and the libraries we require, as documented during my networking week.
   2. Using node, run =ir-serial.js=. It defines several useful subroutines, including a serial communication program identical to Neil's =term.py= (see Networking week for details). In its default mode, it receives camera input over serial and transmits it over websockets to a waiting HTML page.
   3. Open an HTML page such as =dxh.render.infrared.2.html=. The UI should show that the websockets connect successfully and that the camera is connected. (A sketch of the client-side decoding appears after these steps.)
   4. Turn on the emitter rig, and enjoy!
   5. (Note that resetting is clunky: for reasons I suspect are simple but haven't yet had time to track down, nodejs crashes whenever you refresh the webpage, so you'll need to rerun it each time before refreshing the page.)
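For reference, here is a minimal sketch of what the receiving HTML page has to do: connect to the websocket and decode each 16-byte camera report into up to four dot positions. It assumes the server's default port and the JSON-serialized Buffer format that =ir-serial.js= sends; the "no dot" test and the names here are illustrative, not the page's actual code.

#+BEGIN_SRC javascript
// Browser-side sketch: receive camera frames over the websocket and decode
// up to four (x, y) dot positions. Illustrative only --- the real page also
// renders the scene.
const socket = new WebSocket("ws://127.0.0.1:8081/ftdi");

socket.onmessage = (event) => {
  // The server JSON-stringifies a Node Buffer: {type: "Buffer", data: [bytes...]}.
  const bytes = JSON.parse(event.data).data;
  if (!bytes || bytes.length < 16) return;

  const dots = [];
  for (let i = 0; i < 4; i++) {
    // Each dot occupies 3 bytes: low 8 bits of x, low 8 bits of y,
    // then a byte carrying the top two bits of each.
    const x = bytes[1 + 3 * i] + ((bytes[3 + 3 * i] & 0x30) << 4);
    const y = bytes[2 + 3 * i] + ((bytes[3 + 3 * i] & 0xc0) << 2);
    if (x < 1023 && y < 1023) dots.push({ x, y });  // assumption: 1023 means "no dot"
  }
  console.log(dots);  // feed these into a head-position estimate like the one sketched earlier
};
#+END_SRC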
** Surprisingly little code required!

To make snap-fit joints:

#+BEGIN_SRC scad
scale = 1;

// https://www.3dhubs.com/knowledge-base/how-design-snap-fit-joints-3d-printing
// http://fab.cba.mit.edu/classes/S62.12/people/vernelle.noel/Plastic_Snap_fit_design.pdf

// cantilever snap-fit join
contact_height = scale * 3;       // height of surface making contact
extrusion_length = 7 * scale;     // 7mm length
//extrusion_length = 1; //DEBUG

band_thickness = 4.5 * scale;     // smallest dimension as you pinch it
band_width = 7.5 * scale;

//sensor_thickness = 3.61 * scale;
sensor_thickness = 5.80 * scale;
sensor_width = 15.30 * scale;

clip_height = band_width*2.5;
clip_thickness = 2 * scale;
clip_width = band_width*1.1;

aperture_width = 0.25 * scale;    // space between contact point
aperture_width = 1 * scale;
clip_intrusion = clip_width - 2*clip_thickness - aperture_width;

angle_enter = 20;
angle_exit = 90;

module better_staple(window_width, window_height,
                     bridge_thickness, outer_wall_thickness, inner_wall_thickness,
                     taper_fraction = 0.5,
                     angle_enter, angle_exit, clip_width, contact_height,
                     extrusion_length,
                     fillet_diameter = 2.5*scale) {
    // you define:
    // - the full window size (length, width)
    // - the thickness of the bridge
    // - the thickness of the xxinner wall (where the clip is)
    // - the thickness of the xxouter wall (where the flat wall is)
    // - the taper quotient, which determines how much the outer wall tapers
    // - entrance and exit angles for the slope of the clip
    // - the width of the clip
    // xxxx the offset from the clip and the opposite wall
    // - the extrusion size
    // - whether there's a circular fillet

    clip_height_enter = clip_width / tan(angle_enter);
    clip_height_exit = clip_width / tan(angle_exit);
    clip_height = clip_height_enter + clip_height_exit + contact_height;
    //echo(clip_height_enter, clip_height_exit, contact_height);

    linear_extrude(height=extrusion_length) {
        difference() {
            union(){
                // the inner wall
                polygon(points=[
                    [(1-taper_fraction)/2 * inner_wall_thickness, 0],
                    [0, clip_height + window_height],
                    [inner_wall_thickness, clip_height + window_height],
                    [(1+taper_fraction)/2 * inner_wall_thickness, 0] ]);
                // the bridge
                polygon(points=[
                    [0, clip_height+window_height],
                    [0, clip_height+window_height+bridge_thickness],
                    [inner_wall_thickness + outer_wall_thickness + window_width, clip_height+window_height+bridge_thickness],
                    [inner_wall_thickness + outer_wall_thickness + window_width, clip_height+window_height]]);
                // the outer wall
                polygon(points=[
                    [inner_wall_thickness+window_width, 0],
                    [inner_wall_thickness+window_width, clip_height + window_height],
                    [inner_wall_thickness+window_width+outer_wall_thickness, clip_height+window_height],
                    [inner_wall_thickness+window_width+outer_wall_thickness, 0] ]);
                // clip
                translate([outer_wall_thickness*(1-taper_fraction/2), 0])
                    polygon(points=[[0,0],
                                    [clip_width, clip_height_enter],
                                    [clip_width, clip_height_enter + contact_height],
                                    [0, clip_height_enter + contact_height + clip_height_exit]]);
            }
            translate([inner_wall_thickness, clip_height + window_height])
                circle(d=fillet_diameter,center=true);
        }
    }
}

rotate([0,-90 * 1,0]) { //b/c anisotropic 3d printing
    better_staple(
        window_height=band_width*1.05,
        window_width = band_thickness*1.05,
        bridge_thickness=clip_thickness*3,
        inner_wall_thickness=clip_thickness*2.5,
        outer_wall_thickness=clip_thickness*2,
        clip_width = band_thickness*0.5,
        contact_height = band_thickness*0.8,
        taper_fraction=0.5,
        angle_enter = 45,
        angle_exit = 80,
        extrusion_length = 10 * scale );

    // for sensor
    translate([band_thickness*1.05*2+clip_thickness*4*2 -
               clip_thickness*0.15, clip_thickness*12.4, 0])
        scale([-1,-1,1])
        better_staple(
            window_height=sensor_width*1.1,
            window_width = sensor_thickness*1.1,
            bridge_thickness=clip_thickness*3,
            inner_wall_thickness=clip_thickness*2.5,
            outer_wall_thickness=clip_thickness*2,
            clip_width = band_thickness*0.3,
            contact_height = band_thickness*0.3,
            taper_fraction=0.5,
            angle_enter = 45,
            angle_exit = 80,
            extrusion_length = 10 * scale );

    % translate([clip_thickness*2,clip_thickness*2.9,0]) rotate([0,90,0]) band();
}
#+END_SRC

Firmware for I2C and serial communication:

#+BEGIN_SRC C
#include <avr/io.h>
#include <util/delay.h>
#include <stdio.h>
#include "i2cmaster.h"   // external I2C master library (i2c_init, i2c_start_wait, i2c_write, ...)

// i2c address
#define I2C_ADR 0xB0 // ( 0x58 << 1 )

#define output(directions,pin) (directions |= pin) // set port direction for output
#define set(port,pin) (port |= pin) // set port pin
#define clear(port,pin) (port &= (~pin)) // clear port pin
#define pin_test(pins,pin) (pins & pin) // test for port pin
#define bit_test(byte,bit) (byte & (1 << bit)) // test for bit set

//#define bit_delay_time 8.5 // bit delay for 115200 with overhead
#define bit_delay_time 102 // 9600 baud
#define bit_delay() _delay_us(bit_delay_time) // RS232 bit delay
#define half_bit_delay() _delay_us(bit_delay_time/2) // RS232 half bit delay
#define char_delay() _delay_ms(10) // char delay
#define i2c_delay() _delay_ms(50)

#define serial_port PORTD
#define serial_direction DDRD
#define serial_pins PIND
// TX/RX from computer's perspective. in = computer sends TX. out = computer receives on RX.
#define serial_pin_in (1 << PD1)
#define serial_pin_out (1 << PD0)
#define max_buffer 25

#define led_port PORTC
#define led_direction DDRC
#define led_pin_1 (1 << PC1)
#define led_pin_2 (1 << PC0)
#define led_delay() _delay_ms(100) // LED delay

#define SCL_CLOCK 100000L // I2C clock in Hz
#define IR_CAMERA_ADDRESS 0xB0 // 8 bit address (0x58 << 1)

//
// SERIAL
//

void get_char(volatile unsigned char *pins, unsigned char pin, char *rxbyte) {
   //
   // read character into rxbyte on pins pin
   //    assumes line driver (inverts bits)
   //
   *rxbyte = 0;
   while (pin_test(*pins,pin))
      //
      // wait for start bit
      //
      ;
   //
   // delay to middle of first data bit
   //
   half_bit_delay();
   bit_delay();
   //
   // unrolled loop to read data bits
   //
   if pin_test(*pins,pin) *rxbyte |= (1 << 0); else *rxbyte |= (0 << 0);
   bit_delay();
   if pin_test(*pins,pin) *rxbyte |= (1 << 1); else *rxbyte |= (0 << 1);
   bit_delay();
   if pin_test(*pins,pin) *rxbyte |= (1 << 2); else *rxbyte |= (0 << 2);
   bit_delay();
   if pin_test(*pins,pin) *rxbyte |= (1 << 3); else *rxbyte |= (0 << 3);
   bit_delay();
   if pin_test(*pins,pin) *rxbyte |= (1 << 4); else *rxbyte |= (0 << 4);
   bit_delay();
   if pin_test(*pins,pin) *rxbyte |= (1 << 5); else *rxbyte |= (0 << 5);
   bit_delay();
   if pin_test(*pins,pin) *rxbyte |= (1 << 6); else *rxbyte |= (0 << 6);
   bit_delay();
   if pin_test(*pins,pin) *rxbyte |= (1 << 7); else *rxbyte |= (0 << 7);
   //
   // wait for stop bit
   //
   bit_delay();
   half_bit_delay();
}

void put_char(volatile unsigned char *port, unsigned char pin, char txchar) {
   //
   // send character in txchar on port pin
   //    assumes line driver (inverts bits)
   //
   // start bit
   //
   clear(*port,pin);
   bit_delay();
   //
   // unrolled loop to write data bits
   //
   if bit_test(txchar,0) set(*port,pin); else clear(*port,pin);
   bit_delay();
   if bit_test(txchar,1) set(*port,pin); else clear(*port,pin);
   bit_delay();
   if bit_test(txchar,2) set(*port,pin); else clear(*port,pin);
   bit_delay();
   if bit_test(txchar,3) set(*port,pin); else clear(*port,pin);
   bit_delay();
   if bit_test(txchar,4) set(*port,pin); else clear(*port,pin);
   bit_delay();
   if bit_test(txchar,5) set(*port,pin); else clear(*port,pin);
   bit_delay();
   if bit_test(txchar,6) set(*port,pin); else clear(*port,pin);
   bit_delay();
   if bit_test(txchar,7) set(*port,pin); else clear(*port,pin);
   bit_delay();
   //
   // stop bit
   //
   set(*port,pin);
   bit_delay();
   //
   // char delay
   //
   bit_delay();
}

void put_string(volatile unsigned char *port, unsigned char pin, char *str) {
   //
   // print a null-terminated string
   //
   static int index;
   index = 0;
   do {
      put_char(port, pin, str[index]);
      ++index;
   } while (str[index] != 0);
}

// print a string on the serial output pin (used by the error messages below)
#define rs_puts(s) put_string(&serial_port, serial_pin_out, (s))

int main() {
   int i;
   int r = 0;
   unsigned char data_buf[16];
   int p0,p1,p2,p3;
   int Ix1,Iy1,Ix2,Iy2;
   int Ix3,Iy3,Ix4,Iy4;
   int s;
   char buf[255];

   CLKPR = (1 << CLKPCE);
   CLKPR = (0 << CLKPS3) | (0 << CLKPS2) | (0 << CLKPS1) | (0 << CLKPS0);
   //
   // initialize output pins
   //
   set(serial_port, serial_pin_out);
   output(serial_direction, serial_pin_out);
   output(led_port, led_pin_1);
   output(led_port, led_pin_2);
   clear(led_port, led_pin_1);
   clear(led_port, led_pin_2);

   i2c_init(); // i2c_master_init();

   // IR sensor initialize
   //p0=0x72; p1=0x20; p2=0x1F; p3=0x03; // sensitivity
   //p0=0xC8; p1=0x36; p2=0x35; p3=0x03;
   p0=0xAA; p1=0x64; p2=0x63; p3=0x03;
   //p0=0x96; p1=0xB4; p2=0xB3; p3=0x04;
   //p0=0x96; p1=0xFE; p2=0xFE; p3=0x05;

   i2c_start_wait(I2C_ADR+I2C_WRITE);
   i2c_write( 0x30 ); i2c_write( 0x01 );
   i2c_stop();
   if (r!=0) { rs_puts("Error: no ACK\n"); }
   i2c_delay();

   i2c_start_wait(I2C_ADR+I2C_WRITE);
   i2c_write( 0x00 ); i2c_write( 0x02 ); i2c_write( 0x00 ); i2c_write( 0x00 );
   i2c_write( 0x71 ); i2c_write( 0x01 ); i2c_write( 0x00 ); i2c_write( p0 );
   i2c_stop();
   if (r!=0) { rs_puts("Error: no ACK\n"); }
   i2c_delay();

   i2c_start_wait(I2C_ADR+I2C_WRITE);
   i2c_write( 0x07 ); i2c_write( 0x00 ); i2c_write( p1 );
   i2c_stop();
   if (r!=0) { rs_puts("Error: no ACK\n"); }
   i2c_delay();

   i2c_start_wait(I2C_ADR+I2C_WRITE);
   i2c_write( 0x1A ); i2c_write( p2 ); i2c_write( p3 );
   i2c_stop();
   if (r!=0) { rs_puts("Error: no ACK\n"); }
   i2c_delay();

   i2c_start_wait(I2C_ADR+I2C_WRITE);
   i2c_write( 0x33 ); i2c_write( 0x03 );
   i2c_stop();
   if (r!=0) { rs_puts("Error: no ACK\n"); }
   i2c_delay();

   i2c_start_wait(I2C_ADR+I2C_WRITE);
   i2c_write( 0x30 ); i2c_write( 0x08 );
   i2c_stop();
   if (r!=0) { rs_puts("Error: no ACK\n"); }
   i2c_delay();

   while (1) {
      // ask the camera for a report, then read back 16 bytes
      i2c_start_wait(I2C_ADR+I2C_WRITE);
      i2c_write( 0x36 );
      i2c_stop();
      if (r!=0) { rs_puts("Error: no ACK\n"); }

      i2c_start_wait(I2C_ADR+I2C_READ);
      for (i=0;i<16;i++) { data_buf[i]=0; }
      for (i=0;i<16;i++) {
         if (i!=15) { data_buf[i] = i2c_readAck(); }
         else { data_buf[i] = i2c_readNak(); }
      }
      i2c_stop();
      if (r!=0) { rs_puts("Error: no ACK (2)\n"); }

      // each dot: low 8 bits of x, low 8 bits of y, then a byte holding
      // the upper two bits of each
      Ix1 = data_buf[1]; Iy1 = data_buf[2]; s = data_buf[3];
      Ix1 += (s & 0x30) <<4; Iy1 += (s & 0xC0) <<2;
      Ix2 = data_buf[4]; Iy2 = data_buf[5]; s = data_buf[6];
      Ix2 += (s & 0x30) <<4; Iy2 += (s & 0xC0) <<2;
      Ix3 = data_buf[7]; Iy3 = data_buf[8]; s = data_buf[9];
      Ix3 += (s & 0x30) <<4; Iy3 += (s & 0xC0) <<2;
      Ix4 = data_buf[10]; Iy4 = data_buf[11]; s = data_buf[12];
      Ix4 += (s & 0x30) <<4; Iy4 += (s & 0xC0) <<2;

      /* put_char(&serial_port, serial_pin_out, 1); */
      /* put_char(&serial_port, serial_pin_out, 2); */
      /* put_char(&serial_port, serial_pin_out, 3); */
      /* put_char(&serial_port, serial_pin_out, 4); */

      // forward the raw report to the computer over serial
      put_char(&serial_port, serial_pin_out, data_buf[0]);
      put_char(&serial_port, serial_pin_out, data_buf[1]);
      put_char(&serial_port, serial_pin_out, data_buf[2]);
      put_char(&serial_port, serial_pin_out, data_buf[3]);
      put_char(&serial_port, serial_pin_out, data_buf[4]);
      put_char(&serial_port, serial_pin_out, data_buf[5]);
      put_char(&serial_port, serial_pin_out, data_buf[6]);
      put_char(&serial_port, serial_pin_out, data_buf[7]);
      put_char(&serial_port, serial_pin_out, data_buf[8]);
      put_char(&serial_port, serial_pin_out, data_buf[9]);
      put_char(&serial_port, serial_pin_out, data_buf[10]);
      put_char(&serial_port, serial_pin_out, data_buf[11]);
      put_char(&serial_port, serial_pin_out, data_buf[12]);
      put_char(&serial_port, serial_pin_out, data_buf[13]);
      put_char(&serial_port, serial_pin_out, data_buf[14]);
      put_char(&serial_port, serial_pin_out, data_buf[15]);

      /* put_char(&serial_port, serial_pin_out, 5); */
      /* put_char(&serial_port, serial_pin_out, 6); */
      /* put_char(&serial_port, serial_pin_out, 7); */
      /* put_char(&serial_port, serial_pin_out, 8); */

      /* sprintf(buf,"%d," ,Ix1); rs_puts(buf); */
      /* sprintf(buf,"%d," ,Iy1); rs_puts(buf); */
      /* sprintf(buf,"%d," ,Ix2); rs_puts(buf); */
      /* sprintf(buf,"%d\n",Iy2); rs_puts(buf); */

      i2c_delay();
      i2c_delay();
   }
   return 0;
}
#+END_SRC

On the client side, the HTML page's title is "Infrared render camera plotter"; until the websocket and the camera are connected, its status line reads:
: Not connected to anything.
On the server side:

#+BEGIN_SRC javascript
//
// ir-serial.js
// by Dylan Holmes
//
// based on hello-serial.js
// requires nodejs, npm, serial, express@4.15.2, ws, express-ws

// BASIC PARAMETERS
const server_port = 8081
const client_address = '127.0.0.1'
const serial_port = "/dev/ttyUSB0"
var baud = 9600 // 19200 // 9600 // 115200

example_terminal = function(as_text = true) {
    // Create a terminal for speaking with a pcb board over serial.
    // Performs the same function as term.py (from class) or picocom.

    // Set up serial connection
    const SerialPort = require('serialport')
    const Readline = require('@serialport/parser-readline')
    const port = new SerialPort(serial_port, {baudRate: baud, databits:8, dtr: true, })

    // Serial-opening errors will be emitted as an error event
    port.on('error', function(err) {
        console.log('Error: ', err.message)
    })

    // Whenever serial data is received, interpret the data as
    // keycodes and print as a string.
    // See: https://thisdavej.com/making-interactive-node-js-console-apps-that-listen-for-keypress-events/
    port.on('data', function(data) {
        console.log(as_text ? data.toString('utf8') : data)
    })

    // Listen to keyboard in nodejs terminal
    const readline = require('readline')
    readline.emitKeypressEvents(process.stdin)
    process.stdin.setRawMode(true)
    console.log("Ready for input: ")

    process.stdin.on('keypress', (str, key) => {
        // Ctrl-c exits the terminal
        if(key.ctrl && key.name === 'c') process.exit()
        var debug = false
        if(debug) console.log("You pressed the '",str,"' key.", key)
        port.write(key.sequence)
    })
}

infrared_camera_terminal = function() {
    // Speak with a PCB board that is speaking with an infrared
    // positioning camera. (part id: 1738-1250-ND)

    // Set up serial connection
    const SerialPort = require('serialport')
    const Readline = require('@serialport/parser-readline')
    const port = new SerialPort(serial_port, {baudRate: baud, databits:8, dtr: true, })

    var stream = []
    var pos_x = new Array(4)
    var pos_y = new Array(4)
    var upper_bits
    var index = null

    // Serial-opening errors will be emitted as an error event
    port.on('error', function(err) {
        console.log('Error: ', err.message)
    })

    // Whenever serial data is received, interpret the data as
    // the special format of the infrared positional camera.
    var process_ir_data = function() {
        while( stream.length >= 16 ) {
            stream.shift()
            // Look for framing data
            if( stream[0] == 1 && stream[1] == 2 && stream[2] == 3 && stream[3] == 4 ) {
                pos_x[0] = stream[4]
                pos_y[0] = stream[5]
                upper_bits = stream[6]
                pos_x[0] += (upper_bits & 0b00110000) << 4;
                pos_y[0] += (upper_bits & 0b11000000) << 2;

                pos_x[1] = stream[7]
                pos_y[1] = stream[8]
                upper_bits = stream[9]
                pos_x[1] += (upper_bits & 0b00110000) << 4;
                pos_y[1] += (upper_bits & 0b11000000) << 2;

                pos_x[2] = stream[10]
                pos_y[2] = stream[11]
                upper_bits = stream[12]
                pos_x[2] += (upper_bits & 0b00110000) << 4;
                pos_y[2] += (upper_bits & 0b11000000) << 2;

                pos_x[3] = stream[13]
                pos_y[3] = stream[14]
                upper_bits = stream[15]
                pos_x[3] += (upper_bits & 0b00110000) << 4;
                pos_y[3] += (upper_bits & 0b11000000) << 2;

                console.log(stream[4], stream[5], stream[6])
                console.log("x: ",pos_x[0],",\ty: ",pos_y[0])
                // console.log("x: ",pos_x[1],",\ty: ",pos_y[1])

                stream = stream.slice(16)
                return;
            }
        }
    }

    port.on('data', function(data) {
        stream.push(...data) // apparently js has an ... operator.
        process_ir_data()
        console.log(data)
    })

    // Listen to keyboard in nodejs terminal
    const readline = require('readline')
    readline.emitKeypressEvents(process.stdin)
    process.stdin.setRawMode(true)
    console.log("Ready for input: ")

    process.stdin.on('keypress', (str, key) => {
        // Ctrl-c exits the terminal
        if(key.ctrl && key.name === 'c') process.exit()
        var debug = false
        if(debug) console.log("You pressed the '",str,"' key.", key)
        port.write(key.sequence)
    })
}

// example_terminal(false)
//
// infrared_camera_terminal()

// cf example_serial_websocket_server from hello-serial.js
fake_websocket_server = function() {
    // Send synthetic (artificially-generated) serial data to the
    // websockets to confirm that it works.
    // Used in connection with dxh.plot.infrared.html.

    // Set up the web socket server
    const express = require('express')
    const app = express()
    require('express-ws')(app);

    // Connections maintain a list of who has connected to the
    // websocket server, so we know who needs to hear serial messages.
    var connections = []

    app.get('/', (req, res) => res.send('(The infrared-sockets-serial server is running.)'))
    app.listen(server_port, () => console.log(`Infrared Sockets-serial. Listening on port :${server_port}`))

    var simulate_data

    app.ws('/ftdi', function(ws, req) {
        // sockets <-- serial
        simulate_data = function(data) {
            console.log("artificial data:", data)

            // translate into camera format
            data = data[0]
            var upper_bits = 0
            upper_bits |= (data[0] & 0b1100000000) >> 4
            upper_bits |= (data[1] & 0b1100000000) >> 2
            upper_bits |= 0b1111
            data[0] &= 0xff
            data[1] &= 0xff
            data.push(upper_bits)
            data.unshift(0)
            for(var i=0;i<6;i++) { data.push(0xff) }

            var message = {}
            message.data = data
            console.log(message)
            // end translate into camera format

            ws.send(JSON.stringify(data))
        }

        var t = 0
        setInterval(function(){
            t+=0.1;
            var pt = [400+100*Math.cos(t), 100+50*Math.sin(t)]
            pt[0] = Math.floor(pt[0])
            pt[1] = Math.floor(pt[1])
            simulate_data([pt])
        }, 40)

        // sockets --> serial
        ws.on('message', function(msg) {
            console.log("html page sent:", msg)
        });
    });
}

infrared_serial_websocket_server = function() {
    // Set up the web socket server
    const express = require('express')
    const app = express()
    require('express-ws')(app);

    // Connections maintain a list of who has connected to the
    // websocket server, so we know who needs to hear serial messages.
    var connections = []

    app.get('/', (req, res) => res.send('(The infrared-sockets-serial server is running.)'))
    app.listen(server_port, () => console.log(`Infrared Sockets-serial. Listening on port :${server_port}`))

    // Set up serial connection
    const SerialPort = require('serialport')
    const Readline = require('@serialport/parser-readline')
    const port = new SerialPort(serial_port, {baudRate: baud, databits:8, dtr: true, })

    port.on('error', function(err) {
        console.log('Error: ', err.message)
    })

    // Enable communication between sockets and serial. This is a
    // little messy (nested), because we have to do the sockets <--
    // serial connection in a variable scope where ws is defined.
    app.ws('/ftdi', function(ws, req) {
        // sockets <-- serial
        port.on('data', function(data) {
            // console.log("serial connection sent:", data.toString('utf8'))
            //ws.send(data.toString('utf8'))
            ws.send(JSON.stringify(data))
        })

        // sockets --> serial
        ws.on('message', function(msg) {
            console.log("html page sent:", msg)
            port.flush()
            port.write(Buffer.from(msg))
            port.flush()
        });
    });
}

// example_terminal(false);
infrared_serial_websocket_server()
// fake_websocket_server()
#+END_SRC
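Finally, on the rendering side, here is a minimal sketch of how a decoded head position can drive the "window" effect, assuming three.js is loaded on the page. This is not the project's actual render code (a fully correct window would use an off-axis projection), and the screen dimensions are illustrative assumptions.

#+BEGIN_SRC javascript
/* global THREE */
// Sketch: move the virtual camera to mirror the viewer's head and keep it aimed
// at the screen plane. Illustrative only; constants are assumptions.
const SCREEN_W_MM = 520;   // assumed physical width of the monitor
const SCREEN_H_MM = 320;   // assumed physical height of the monitor

const camera = new THREE.PerspectiveCamera(45, SCREEN_W_MM / SCREEN_H_MM, 1, 10000);

// head = {x, y, z} in millimetres relative to the screen centre,
// e.g. the output of the headPosition() sketch near the top of this page.
function updateCamera(head) {
  camera.position.set(head.x, head.y, head.z);
  camera.lookAt(0, 0, 0);  // the virtual scene sits behind the screen plane

  // Widen or narrow the field of view as the viewer moves closer or farther,
  // so the screen rectangle keeps framing the same slice of the scene.
  camera.fov = 2 * Math.atan((SCREEN_H_MM / 2) / head.z) * 180 / Math.PI;
  camera.updateProjectionMatrix();
}
#+END_SRC

Together with the decoding sketch earlier, this closes the loop: dots from the camera become a head position, and the head position becomes a camera pose in the rendered scene.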