
Configuring Multitech MDOT for TTN

I have a Multitech MDOT-BOX for testing. Configuring it for TTN requires connecting it to a computer, after which AT commands can be used to probe and set parameters. The following commands reset the MDOT to factory defaults and show the configuration overview.

AT&F
AT&V

Firmware: 		2.0.0
Library : 		0.0.9-14-g4845711
Device ID:		00:80:00:00:00:00:b3:76
Frequency Band:		FB_868
Public Network:		off
Network Address:	00000000
Network ID:		6c:4e:ef:66:f4:79:86:a6
Network ID Passphrase:	MultiTech
Network Key:		1f.33.a1.70.a5.f1.fd.a0.ab.69.7a.ae.2b.95.91.6b
Network Key Passphrase:	MultiTech
Network Session Key:	00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00
Data Session Key:	00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00
Network Join Mode:	OTA
Tx Data Rate:		SF_7
Tx Power:		11
Log Level:		6
Maximum Size:		242
Minimum Size:		11
Maximum Power:		20
Minimum Power:		2
Data:			0

After adding a device to the application page on the TTN console with OTA activation, the following identifiers/keys are listed on the TTN console page for the device:

Device EUI
Application EUI
App Key
Device Address
Network Session Key
App Session Key

From the Multitech documentation: In OTA mode, the device only needs to be configured with a network name (+NI=1,name) and network passphrase (+NK=1,passphrase). The network session key, data session key, and network address are all automatically configured.

With ABP activation, the following identifiers/keys are listed on the TTN console page for the device:

Device EUI
Application EUI
Device Address
Network Session Key
App Session Key

From the Multitech documentation: In Manual mode, there is no join request sent and the device must be manually configured with a network address (+NA), a network session key (+NSK), and a data session key (+DSK). The device must be provisioned with the network server as well.

Mapping between TTN and MDOT keys/identifiers

The Device EUI is hardware dependent and should therefore be copied from the MDOT device to the TTN console. The other keys/identifiers are application dependent and should be copied from the TTN console to the device.

For OTA the following settings need to be updated. Note that the +NI=0 and +NK=0 forms used here specify the EUI and key directly in hex, whereas the +NI=1 and +NK=1 forms from the documentation derive them from a name/passphrase:

AT+PN=1
AT+NJM=1
AT+NI=0,70B3D57EF000451B
AT+NK=0,5108008928062F42980E42C20AC1E4E1
AT&W

 TTN                    | direction | MDOT
------------------------|-----------|-------------
 Device EUI             |    <-     | Device ID
 Application EUI        |    ->     | Network ID
 App Key (only in OTA)  |    ->     | Network Key

I could not get ABP to work.

 TTN                    | direction | MDOT
------------------------|-----------|-------------
 Device EUI             |    <-     | Device ID
 Application EUI        |    ->     | xxx
 App Key (only in OTA)  |    ->     | xxx
 Device Address         |    ->     | xxx
 Network Session Key    |    ->     | xxx
 App Session Key        |    ->     | xxx
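
Based on the Multitech documentation quoted above, I would expect a manual (ABP) configuration to look something like the following, with the values copied from the TTN console. I have not been able to verify this, and the +NJM=0 value for manual join mode is my assumption from the documentation.

AT+PN=1
AT+NJM=0
AT+NA=<Device Address>
AT+NSK=<Network Session Key>
AT+DSK=<App Session Key>
AT&W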

EEG combined with VR

We recently had a meeting at the Astron radio telescope for the COGITO project with Daniela de Paulis, Stephen Whitmarsh, Guillaume Dumas and others. One of the goals of that meeting was to try out the combination of the EEG system with the Oculus Rift VR system.

For the COGITO project we are using the GTec Nautilus EEG system. Our specific system comprises a 32-channel wireless amplifier that mounts on the back of the EEG cap, in combination with EEG caps in three different sizes. The caps have 64 holes at a subset of the locations of the 5% electrode placement standard. We are not using the “Sahara” dry electrode option, but rather the regular wet electrodes.

We started by removing all electrodes and cups from the cap, to get a clear view of which electrode sites are accessible. The central electrode locations (i.e. the z-line), temporal electrode locations and occipital electrode locations are occluded by the VR head mount. But there are still plenty of electrode locations accessible.

We are using the Nautilus in combination with wet electrodes. These consist of a small cup that is mounted in the holes of the cap. Each cup comes with a label. It is a bit fiddly to mount all the cups on the cap; not something to do every day.

The electrodes themselves are fixed to the wireless amplifier and “click” smoothly onto the cup. We made some small adjustments to the selected electrode sites to have them all fit nicely in the spaces of the VR head mount.

The Nautilus EEG system has two sets of wires, going to the left and right. That is convenient with the head mount.

Putting the VR headset on top of the EEG cap and removing it again requires care, as it is easy for electrode wires to get stuck. But once the VR headset is mounted over the EEG cap it all fits nicely and is comfortable for the subject.

Below you can see Guillaume wearing the EEG and VR system while he is watching the 3D movie and while his EEG is being recorded.

Art-Net to DMX512 with ESP8266

Update 26 May 2017 – added photos of a second unit and screenshots of the web interface for OTA.

Professional stage and theatre lighting fixtures are mainly controlled over DMX512. To allow a convenient interface between the EEGsynth and this type of professional lighting systems, I built an Artnet-to-DMX512 converter. It quite closely follows the design of my Artnet-to-Neopixel LED strip module.

Let me first show the finished product. It has a 5 pin XLR connector, a 2.1 mm power connector, and a multi-color status LED:

Here is the working prototype based on a NodeMCU board

Art-Net to DMX512 prototype

It consists of the following components:

  • NodeMCU Lua ESP8266 development board (replaced by a Wemos D1 mini in the final version)
  • MAX485 TTL to RS-485 module
  • common cathode RGB LED
  • 2x 220 Ohm and 1x 100 Ohm resistors
  • 2-24V to 5V DC-DC Boost-Buck converter (not in the prototype)

I was first planning to follow the design of Matthias Hertel, which includes an optical isolation between the MCU board and the DMX output. But I realised that the isolation transformer and optocoupler together cost more than the ESP8266, so I decided to keep it simple and not add over-voltage protection. I would have added protection if the converter had been directly coupled to the USB port of a computer, but in this case the computer connects over WiFi.

Another consideration is the voltage to drive the RS485 output. While testing, my Stairville LED flood panel seemed quite happy with the 3.3V provided by the NodeMCU board. Driving the whole MAX485 module at 3.3V is therefore an option. But I figured that the MAX485 module can also be powered with 5V (resulting in a 5V differential signal on its output) while the TTL input to the module remains at 3.3V.

In principle it would be possible to provide power on the micro-USB port of the NodeMCU. However, the wall of the enclosure in which I’ll put it is quite thick, which would result in the micro-USB connector sitting too deep for some cables/chargers. Furthermore, my experience is that it is not so easy to mount the ESP8266 board firmly enough to prevent it from moving when connecting/disconnecting the cable. Hence I opted for a 5.5 x 2.1 mm panel mount DC jack connector to provide 5V power. Since many power adapters (wall warts) supply 9 or 12 volts, I included a 2-24V to 5V DC-DC power converter.

The source code is available from github. It is based on the code for my Artnet Neopixel controller. It was especially convenient to reuse the part for initially configuring the WiFi, for configuring the settings, and for over-the-air updating.
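
The actual firmware is in the repository; purely as an illustration of the receiving side, an ArtDMX packet can be picked up from UDP port 6454 and parsed roughly as follows (the WiFi credentials are placeholders and this is a simplified sketch, not the production code):

// Simplified sketch: receive an ArtDMX packet over UDP on an ESP8266 and
// extract the universe and channel values (not the actual firmware).
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>

WiFiUDP Udp;
uint8_t packet[18 + 512];  // Art-Net header plus up to 512 DMX channels

void setup() {
  Serial.begin(115200);
  WiFi.begin("your-ssid", "your-password");  // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(100);
  Udp.begin(6454);  // Art-Net uses UDP port 6454
}

void loop() {
  int len = Udp.parsePacket();
  if (len < 18) return;  // too short to be an ArtDMX packet
  Udp.read(packet, sizeof(packet));
  if (memcmp(packet, "Art-Net", 8) != 0) return;       // check the header ID
  uint16_t opcode = packet[8] | (packet[9] << 8);      // little-endian opcode
  if (opcode != 0x5000) return;                        // 0x5000 = ArtDMX
  uint16_t universe = packet[14] | (packet[15] << 8);  // SubUni and Net bytes
  uint16_t length = (packet[16] << 8) | packet[17];    // big-endian data length
  // packet[18] onward holds the DMX channel values, to be sent out over RS485
  Serial.printf("universe %u, %u channels, ch1=%u\n", universe, length, packet[18]);
}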

The repository on github includes the details on how to wire the components.

Update 26 May 2017

Per Huttner asked me to make one for the next EEGsynth performance. The only change compared to the first one (see above) is that I used a 3-pin XLR connector.

Here you can see the web interface upon initial startup (while connected to the ARTNET access point). As an access point it serves this “captive” configuration screen automatically, just like the login screen on a public WiFi network. The WiFi configuration is based on WiFiManager.

Following WiFi setup it resets and after some 10 seconds the LED should turn green. At that point you can connect to the web interface. If you are using OS X with Bonjour or Linux with zeroconf, you can connect to http://artnet.local. If your computer does not support zeroconf (e.g. on Windows) you will have to log in on your router to look up the IP address that it has assigned.

In the settings you can configure the universe that is to be forwarded from Art-Net to DMX512 and the number of channels. Specifying fewer channels makes the DMX message shorter and hence allows for more frequent (and smoother) updates. The delay is the (approximate) time between DMX packets. The default is 25 ms, i.e. 40 packets per second, which is approximately the fastest you can transmit with the full 512 channels: at the DMX bit rate of 250 kbit/s each slot takes 44 µs, so a full frame of the start code plus 512 channels takes roughly 23 ms, plus the break between frames.
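
The DMX transmission itself is part of the firmware in the repository. As a rough sketch of the general technique, and assuming the MAX485 data input is driven from the ESP8266's transmit-only Serial1 (GPIO2, pin D4 on the Wemos D1 mini), a frame can be sent like this, with the break approximated by briefly dropping the baud rate:

// Rough sketch of sending one DMX frame from the ESP8266's Serial1 to the MAX485.
// Not the actual firmware; timing and pin assignment are assumptions.
#include <Arduino.h>

void sendDMX(const uint8_t *channels, uint16_t numChannels) {
  Serial1.flush();                       // wait until the previous frame is out
  Serial1.begin(90000, SERIAL_8N2);      // a zero byte at 90 kbaud gives ~100 us low, i.e. the break
  Serial1.write((uint8_t)0);
  Serial1.flush();
  Serial1.begin(250000, SERIAL_8N2);     // DMX bit rate: 250 kbit/s, 8 data bits, 2 stop bits
  Serial1.write((uint8_t)0);             // start code 0 for a regular DMX frame
  Serial1.write(channels, numChannels);  // up to 512 channel values
}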

The monitor window shows the firmware version, the uptime in seconds, the number of Art-Net packets received and the frame refresh. The frame refresh is actually not accurate, since reading from the web interface shuts down the refreshing. You can read more accurate numbers from http://artnet.local/json. 

GPS-enabled LoRaWAN temperature sensor

Together with the TTN Nijmegen community we are discussing possible applications of remote sensing nodes in Nijmegen. To get a better view of the TTN coverage in Nijmegen and to get a feel for what works (and what does not), we are working on the implementation of some nodes.

The PoC2 TTN gateway will soon be installed by Michiel Nijssen at Maptools in Molenhoek. To help Michiel get started, we agreed that I would give him a fully functional node to play with. Michiel came up with a very concrete idea: a GPS-enabled temperature sensor that sends its data over LoRaWAN/TTN. Below you can find some details of a very first implementation.

The node consists of:

  • Teensy 3.2 MCU board
  • Dorji LoRa module
  • DS18b20 temperature sensor
  • Ublox NEO-M8N GPS module
  • 4k7 ohm resistor
  • small LED and 200 ohm resistor (not on photo)

I estimate that the material costs amount to 50 euro. It still needs to be soldered in a more sturdy form-factor and a battery and enclosure need to be added.

I created a corresponding application on console.thethingsnetwork.org where I configured the node with ABP.

The node is running a sketch (i.e. firmware) that I developed in the Arduino IDE. On the receiving side I am using a node.js application, which uses MQTT to connect with TTN and to receive the messages. The receiving application also implements a simple web interface that displays the most recently received data; it runs on a Raspberry Pi in my basement.

The Arduino code for the teensy can be found in my arduino repository on github in the teensy_gps_temp_ttn directory.
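
As an illustration of the sensor-reading part (the pin assignments below are assumptions, not necessarily those used in teensy_gps_temp_ttn), the DS18b20 is read over OneWire with the 4k7 pull-up and the NEO-M8N is read over one of the Teensy hardware serial ports:

// Illustrative sensor-reading sketch for the Teensy 3.2; the LoRaWAN uplink is omitted.
#include <OneWire.h>
#include <DallasTemperature.h>
#include <TinyGPS++.h>

OneWire oneWire(3);                  // DS18b20 data pin, pulled up to 3.3V with the 4k7 resistor
DallasTemperature sensors(&oneWire);
TinyGPSPlus gps;

void setup() {
  Serial.begin(115200);              // USB serial for debugging
  Serial1.begin(9600);               // NEO-M8N GPS module on hardware Serial1
  sensors.begin();
}

void loop() {
  while (Serial1.available())
    gps.encode(Serial1.read());      // feed NMEA characters to the parser

  sensors.requestTemperatures();
  float celsius = sensors.getTempCByIndex(0);

  if (gps.location.isValid()) {
    Serial.print(gps.location.lat(), 6);
    Serial.print(", ");
    Serial.print(gps.location.lng(), 6);
    Serial.print(", ");
  }
  Serial.println(celsius);
  delay(10000);                      // in the real node this is where the TTN uplink would go
}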

The node.js code for the receiving application can be found in its own repository on github.

The web interface can be found on gpstemp.robertoostenveld.nl. Note that I might not keep this particular web application running for very long, so don’t be surprised if the URL stops working. To check that it is displaying live data, you can reload every 10 seconds or so. The counter should increase, and the numbers might be a bit different.

Getting started with Pine64

UPDATE: see the end of this post for some problems that I encountered after the initial install.

The Pine64 is a single board computer that resembles the Raspberry Pi, but with a 64-bit CPU and up to 2GB of RAM, available for $15-$29. It was introduced with a Kickstarter campaign which I supported. My 2GB Pine64 has been lying on a shelf for quite some time, as I was waiting for the kernel, distribution and documentation to mature.

My first installation yesterday went fine (with some slight trouble getting WiFi connected), but while updating the kernel the root disk partition completely filled up and borked the installation. Hence I have to start again. Let me now document it, as I might need to repeat the installation yet again.

I primarily followed the instructions from https://www.pine64.pro/getting-started-linux/ with some additional information from http://forum.pine64.org/showthread.php?tid=982. I am working off an Apple MacBook Pro computer.

After downloading the Debian Base disk image, I used 7z to unzip it:

mbp> brew install p7zip
mbp> 7z x pine64-image-debianbase-310102bsp-2.img.xz

I inserted an empty Samsung 16 GB EVO UHS-I Class 10 micro SD card and wrote the image to it:

mbp> diskutil unmountDisk /dev/disk1
mbp> sudo dd if=pine64-image-debianbase-310102bsp-2.img of=/dev/disk1 bs=1024k

and I unmounted it again:

mbp> diskutil unmountDisk /dev/disk1

In order to configure the WiFi connection of the Pine64, I connected it to a keyboard and to my LCD TV screen and powered it up with a 5V 2A micro USB power supply. I noticed that the HDMI connection is a bit flaky: the TV repeatedly reported “no connection”, and wiggling the HDMI connector brought the boot sequence back on screen. I also connected the wired ethernet, which – without configuration changes – obtained an IP address from my router using DHCP.

I added the following to /etc/network/interfaces

auto wlan0
iface wlan0 inet dhcp
wpa-conf /etc/wpa_supplicant/wpa_supplicant.conf
iface wlan1 inet manual

and the following to /etc/wpa_supplicant/wpa_supplicant.conf

ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
network={
    ssid="Linksys-E900"
    psk="xxxx"
    proto=RSN
    key_mgmt=WPA-PSK
    pairwise=CCMP
    group=CCMP
    auth_alg=OPEN
    priority=9
}

which I read-protected with

root@pine64# chmod 600 /etc/wpa_supplicant/wpa_supplicant.conf

Then I restarted the wifi network

ifdown wlan0
ifup wlan0

and checked for the network:

root@pine64:~# ifconfig 
eth0      Link encap:Ethernet  HWaddr 36:c9:e3:f1:b8:05  
          inet addr:192.168.1.13  Bcast:192.168.1.255  Mask:255.255.255.0
          inet6 addr: fdeb:a2d5:862:0:34c9:e3ff:fef1:b805/64 Scope:Global
          inet6 addr: fe80::34c9:e3ff:fef1:b805/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:2607 errors:0 dropped:0 overruns:0 frame:0
          TX packets:1127 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:200040 (195.3 KiB)  TX bytes:236261 (230.7 KiB)
          Interrupt:114 

lo        Link encap:Local Loopback  
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:65536  Metric:1
          RX packets:0 errors:0 dropped:0 overruns:0 frame:0
          TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0 
          RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)

wlan0     Link encap:Ethernet  HWaddr 34:c3:d2:71:90:16  
          inet addr:192.168.1.14  Bcast:192.168.1.255  Mask:255.255.255.0
          inet6 addr: fe80::36c3:d2ff:fe71:9016/64 Scope:Link
          inet6 addr: fdeb:a2d5:862:0:36c3:d2ff:fe71:9016/64 Scope:Global
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:505 errors:0 dropped:1 overruns:0 frame:0
          TX packets:12 errors:0 dropped:1 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:99440 (97.1 KiB)  TX bytes:1969 (1.9 KiB)

As I ran into disk space problems yesterday – perhaps because I had done apt-get update && apt-get upgrade before resizing the root filesystem – I now ran the update commands in the following order:

resize_rootfs.sh
reboot

followed by

/usr/local/sbin/pine64_update_uboot.sh
/usr/local/sbin/pine64_update_kernel.sh
reboot

and finally

apt-get update
apt-get upgrade
timedatectl set-timezone Europe/Amsterdam

UPDATE (11 Feb 2017): My Pine64 board turns out to be quite unstable. Depending on how I plug in the HDMI cable, I get an image on my screen or not. Wiggling the HDMI connector sometimes helps, sometimes not. Furthermore (and more serious, since I wanted to use it as a headless server), if there is some serious network traffic, it crashes. Not once, but quite consistently. I read elsewhere that the build quality of the Pine64 is not very good. It might also be that the problem is in powering the board: I am using a 2A power adapter (which works fine on all my Raspberry Pis).

ESP-8266 Art-Net NeoPixel module

As explained in a previous post, for the EEGsynth we want to use a neopixel array that can be controlled wirelessly using the DMX512 protocol. I purchased a number of Adafruit neopixel rings with 12, 16 and 24 elements. Each RGBW pixel contains a red, green, blue and white LED. For the 24-pixel ring that means that there are in total 4*24=96 LEDs whose intensity can be set.

The ESP-8266 module is a versatile WiFi module that comes in many versions. During development I especially like the NodeMCU version, which mounts the ESP-12 module on a development board with a USB connection, and the even smaller Wemos D1 mini board. The Wemos D1 mini is hardly more expensive on Ebay than the simpler bare-bone ESP-8266 modules.

The hardware connection is simple: I connected Vcc and GND directly to the Wemos D1 mini board, and connected pin D2 to the data-in of the first pixel. Although the Neopixels are specified for 5V, in my experience the Adafruit rings also work fine at 3.3V, both for power and for the serial control signal. Each LED can take up to 20 mA when fully bright, which means that all LEDs of the 24-pixel RGBW ring can take up to 24*4*20 = 1920 mA, or close to 2 A. However, not all LEDs will be at full intensity at the same time, and driving them with 3.3V rather than 5V further reduces the current. I encountered no issues powering them over the USB port of my MacBook.

For the EEGsynth we want to map a small number of control signals to aesthetically pleasing light effects. For example, a control signal can set the hue, the frequency with which the array flashes, or the speed with which a bright bar rotates along the ring.

I implemented the firmware as an Arduino sketch that combines a number of features. It incorporates ConfigManager for the OTA (over-the-air) configuration of the WiFi network to which it should connect. Once connected to the local WiFi network, the ConfigManager also allows updating specific settings in EEPROM over a POST call to a specific URL. Settings include the number of pixels of the attached Neopixel array, whether they are RGB or RGBW, and most importantly: the mode with which the controller maps the control signals onto the LED behaviour.
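
As a rough impression of what gets stored in EEPROM, the settings amount to something like the following structure (the field names are illustrative; the actual firmware defines its own):

// Hypothetical settings structure, kept in EEPROM and updated over the POST call.
#include <stdint.h>

struct Config {
  uint16_t pixels;   // number of pixels of the attached Neopixel array
  uint8_t  leds;     // 3 for RGB pixels, 4 for RGBW pixels
  uint8_t  mode;     // how the control signals are mapped onto the LED behaviour
  uint16_t universe; // Art-Net universe to listen to
};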

The firmware listens to the Artnet protocol messages that it receives as UDP packets. The Artnet packets can be sent by the EEGsynth outputartnet module, but also by general purpose Artnet software, such as JV Lightning DmxControl, LightKey or QLC+.

The first mode that I implemented allows for full control of all LEDs. It maps the DMX512 channels like this:

mode 0: individual pixel control
channel 1 = pixel 1 red
channel 2 = pixel 1 green
channel 3 = pixel 1 blue
channel 4 = pixel 1 white
channel 5 = pixel 2 red
etc.
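
As an illustration (not a literal copy of the firmware), mode 0 amounts to copying four consecutive DMX channels into each RGBW pixel of the ring connected to pin D2:

// Illustrative implementation of mode 0: four DMX channels per RGBW pixel.
#include <Adafruit_NeoPixel.h>

Adafruit_NeoPixel strip(24, D2, NEO_GRBW + NEO_KHZ800);  // 24-pixel RGBW ring on pin D2

// 'dmx' holds the channel values from the received ArtDMX packet
void updateMode0(const uint8_t *dmx, uint16_t numChannels) {
  for (uint16_t pixel = 0; pixel < strip.numPixels(); pixel++) {
    uint16_t ch = pixel * 4;
    if (ch + 3 >= numChannels)
      break;  // stop when we run out of DMX channels
    strip.setPixelColor(pixel, dmx[ch], dmx[ch + 1], dmx[ch + 2], dmx[ch + 3]);
  }
  strip.show();
}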

The simplest overall uniform color mode is implemented like this:

mode 1: single uniform color
channel 1 = red
channel 2 = green
channel 3 = blue
channel 4 = white
channel 5 = intensity

This allows 3 channels (for RGB) or 4 channels (for RGBW) to control the color, and one channel to control the intensity. The intensity channel is in principle redundant, but makes the control much easier.
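
A sketch of how the intensity channel could simply scale the uniform RGBW values (again an illustration, not the firmware itself):

// Illustrative implementation of mode 1: one uniform color, scaled by channel 5.
#include <Adafruit_NeoPixel.h>

extern Adafruit_NeoPixel strip;  // the ring declared elsewhere in the sketch

void updateMode1(const uint8_t *dmx) {
  uint8_t intensity = dmx[4];              // channel 5 scales the overall intensity
  uint8_t r = (dmx[0] * intensity) / 255;
  uint8_t g = (dmx[1] * intensity) / 255;
  uint8_t b = (dmx[2] * intensity) / 255;
  uint8_t w = (dmx[3] * intensity) / 255;
  for (uint16_t pixel = 0; pixel < strip.numPixels(); pixel++)
    strip.setPixelColor(pixel, r, g, b, w);
  strip.show();
}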

I implemented many more modes, including blinking/flashing of one or two colors, segments that can be moved over the ring (of which the color and position can be controlled), and segments that automatically move around the ring (of which the color and speed can be controlled). The modes are all documented in the code and in the README document included with the Arduino sketch in my Github repository.

The video below demonstrates one of the modes, controlled by the launchcontrolXL module of the EEGsynth. It shows the ESP-8266 Artnet neopixel module connected both to a 24-pixel Neopixel ring and to a 144-pixel LED strip. I will document the hardware details of the LED strip in a follow-up post.

On my YouTube channel you can find more examples, including a special Christmas tree mode 😉

Scalable lighting systems

The Christmas holiday is always a nice time of the year to spend studying and tinkering on electronics projects. In the EEGsynth project we have identified that it would be cool to control light with brain and body signals, besides controlling modular synthesizers, which we have focussed on so far. As it is not yet clear what kind of light and what kind of control will conceptually and aesthetically work well with the EEGsynth control signals, I have been studying both small and large lighting systems. We might for example want to use small and wearable lights on a performer, or control the stage light, or use a LED strip as an indicator of the EEG-extracted control signals.

In theatrical and stage performance lighting there is a clearly dominant standard: DMX512. For lighting setups there are many fixtures (i.e. lamps rigged on a ceiling-mounted truss) that can be remotely controlled over DMX512: not only on-off, but they can be dimmed, the color can be changed, spotlights can be moved, etc. If you look for example on Thomann, you’ll see that many light fixtures support DMX.

The Disco Biscuits – City Bisco – 10/5/12 – The Mann Center for the Performing Arts – Philadelphia, PA – Photo © Dave Vann 2012

Going to the smallest systems, I considered individual LEDs. Neopixels are a very interesting type of RGB LED, which combine a red, green and blue (and sometimes white) LED in a single housing of a few mm, together with a controller chip. The controller chip allows the individual LED intensities of the neopixels to be set over a serial connection by a microcontroller such as an Arduino. Furthermore, multiple Neopixels can be daisy-chained, where each pixel in the array can be addressed individually. LED strips with 30, 60 or even 144 pixels per meter can be purchased by the meter, for example on Ebay.

Adafruit NeoPixel Ring with 16 x 5050 RGB LEDs with integrated drivers

For the EEGsynth it is desirable to have a single control module that provides a uniform interface between ExG control signals and light control. An individual neopixel can be considered as an RGB lamp, just like a theatrical stage light. The intensity of the red, green and blue can be controlled, just like the DMX channels of a stage light. Controlling a small LED jewel worn by the performer should not be different from controlling the light of the stage on which the performer acts.

An important difference between the requirements for fixed stage lighting and for a small wearable LED jewel is that the first must hook up to existing DMX512 cabling systems, whereas the second should be wireless. This is where Art-Net and the ESP-8266 come in. Art-Net is a protocol for sending the DMX control protocol over a network. The ESP-8266 is a small and low-cost microcontroller combined with a WiFi chip that is compatible with Arduino.

Further details on the hardware and firmware design for the actual light controller modules will come in a series of follow-up posts.

ESP-12 bootloader modes and GPIO state at startup

Since I encountered some initial difficulties in programming the ESP-12 version of the ESP8266 module using the Arduino IDE, let me summarise here some findings based on information from [1,2,3].

ESP-12 pinout

The ESP-12 module exposes 11 GPIOs. Three of them are especially relevant, as they determine the bootloader mode at startup or following reset.

 Mode                              | GPIO 0 | GPIO 2 | GPIO 15
-----------------------------------|--------|--------|---------
 Flash Startup (Normal)            |   1    |   1    |   0
 UART Download Mode (Programming)  |   0    |   1    |   0
 SD-Card Boot                      |   0    |   0    |   1

Furthermore, CH_PD (chip enable) should be pulled up, and RESET should be pulled up or left floating. If you connect RESET to ground, the module resets.

I have not yet figured out what the SD-Card boot means, so in my applications GPIO 2 should always be pulled up and GPIO 15 should always be pulled down. I am using 10k resistors, but smaller values (e.g. 3.3k) should also work.

To facilitate development, I connected two push button switches to the GPIO 0 and RESET pins, shorting them to ground when pressed. When the buttons are not pressed, they are both pulled up to 3.3V using a 10k resistor.

This allows me to do the following two-finger-action to restart in programming mode and allow the Arduino IDE to upload a new firmware:
– press reset button
– press programming button
– release reset button
– release programming button

References

[1] https://zoetrope.io/tech-blog/esp8266-bootloader-modes-and-gpio-state-startup
[2] http://www.instructables.com/id/Getting-Started-with-the-ESP8266-ESP-12/
[3] http://www.instructables.com/id/ESP8266-Using-GPIO0-GPIO2-as-inputs/

TTN/LoRa using Dorji DRF1272F module

Teensy connected to DRF1272F

So far I have been experimenting with LoRa and TTN using a Multitech MDot board and with a HopeRF RFM95W module connected to a Teensy, but I decided to try something else. Franz, one of the members of the TTN Nijmegen community, started experimenting with node-to-node communication using Dorji DRF1278F 433MHz modules. I’d like to support him in converting to 868MHz, so that he can post data to TTN once a gateway becomes available in his range.

The Dorji modules are currently among the cheapest LoRa modules available on Ebay. So some weeks ago I ordered a DRF1272F 868MHz module for about $8, which arrived this week.

The first surprise is that it has a 1.27 mm pitch header connector. The module has 13 contacts, but not all are required for the LMIC Arduino library. To make it easier to handle, I made a custom break-out board that connects the required pins to a 2.54 mm pitch 8-pin header. Soldering the wires at 1.27 mm pitch was quite a challenge; you may want to use a magnifying glass, as those pads are tiny!

DRF1272F module adapter board

Based on the DRF1272F datasheet, the LMIC Arduino library documentation, and the Teensy pinout I connected it as follows:

 DRF1272F  |   Teensy 3.2
--------------------------
 RESET     |   nc
 DIO0      |   2
 DIO1      |   5
 DIO2      |   nc
 DIO3      |   nc
 DIO4      |   nc
 DIO5      |   nc
 3.3V      |   3.3V
 GND       |   GND
 SCK       |   13 - SCK
 MISO      |   12 - DIN 
 MOSI      |   11 - DOUT
 NSS       |   10 - CS

Please note that I did not connect the RESET pin, nor the DIO2 pin, which would only be needed for FSK.

I used the following snippet of code in my Arduino sketch to specify the pin mapping:

// Pin mapping for the DRF1272F wired to the Teensy 3.2 as in the table above
const lmic_pinmap lmic_pins = {
    .nss = 10,                      // NSS on Teensy pin 10 (CS)
    .rxtx = LMIC_UNUSED_PIN,
    .rst = LMIC_UNUSED_PIN,         // RESET is not connected
    .dio = {2, 5, LMIC_UNUSED_PIN}, // DIO0 on pin 2, DIO1 on pin 5, DIO2 not connected
};

On the software side I am using Arduino 1.6.9, the LMIC library and the same sketch that I have been using with the RFM95W module.

I had to change the Semtech radio from SX1276 to SX1272 in the arduino-lmic/src/lmic/config.h:

#define CFG_eu868 1
//#define CFG_us915 1
// This is the SX1272/SX1273 radio, which is also used on the HopeRF
// RFM92 boards.
#define CFG_sx1272_radio 1
// This is the SX1276/SX1277/SX1278/SX1279 radio, which is also used on
// the HopeRF RFM95 boards.
//#define CFG_sx1276_radio 1

With all of this in place, the node is nicely sending packets to my TTN application.

Bidirectional communication over The Things Network

I have been experimenting today with an RFM95W hooked up to a Teensy and managed to implement full bidirectional communication to/from The Things Network.

The Teensy by default sends the temperature (from a DS18B20) with every transmit. If you press the button, it sends the button press event instead. Furthermore, on every transmit it listens for a message (which can be scheduled as a downlink through the TTN dashboard), and blinks the LED if a message is received.
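
The exact code is in the repositories linked below; as a rough sketch of how a downlink is typically picked up with the LMIC library, any payload received in the RX window shows up in LMIC.frame when the EV_TXCOMPLETE event fires:

// Sketch of the LMIC event handler that checks for a downlink after each uplink.
// Illustration only; the LED handling in the actual sketch is more elaborate.
#include <lmic.h>
#include <hal/hal.h>

void onEvent(ev_t ev) {
  if (ev == EV_TXCOMPLETE) {
    if (LMIC.dataLen > 0) {
      // a downlink message was received in the RX window following the uplink
      digitalWrite(LED_BUILTIN, HIGH);  // turn on the LED to signal the message
      for (int i = 0; i < LMIC.dataLen; i++) {
        Serial.print(LMIC.frame[LMIC.dataBeg + i], HEX);
      }
      Serial.println();
    }
    // this is also the place to schedule the next uplink
  }
}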

The Arduino code running on the Teensy can be found here and the server application code running on the Raspberry Pi here.

Still to be done is to extend the server application code with the button (to circumvent the TTN dashboard altogether) and to come up with an actual application that is smarter than a button and a LED. I am thinking of linking both the uplink and the downlink to an IFTTT maker channel.
