The Idea

In this episode, we will be building two Raspberry Pis with light and thermal sensors attached for data logging. We will then process the data logs with the goal of automatically tracking my work and sleep habits. This episode is a mashup of hardware, programming, and some data analysis with a text editor, SQL, and spreadsheets.

For those new here, at the start of 2019 I started my own business via this website, where I am releasing weekly screencasts on Sysadmin / DevOps topics. So, the high-level goal of this data logging project is to track my work and sleep habits over many months, based on my presence in particular rooms. I wanted to do this since starting a new business can be really stressful, and I wanted to see how my hours worked and sleep habits are affected.

Why and How it Works

Let me explain how the data logging works. Say I am in my office and have one of these Raspberry Pi based sensors mounted on the wall. Well, this is a stock image of what my office might look like. So, on one of the sensors, I am capturing how much visible light there is in the room, for state like, hey, are the lights on or off? Then, on the second sensor, I am also capturing thermal data for a bunch of points in that room.

I can then use this data to capture state like, okay, the lights are on, but is someone actually in the room? It would look something like this if you were to project, or overlay, the sensor's thermal detection grid onto the room. So, in each of these 64 cells, in this 8x8 grid, we can accurately measure and record temperature readings. For example, if I were sitting at my desk, you would see much higher temperatures in some of these cells, where orange indicates a higher temperature and blue is colder.

Then, on the Raspberry Pi, all I am doing is continually logging these sensor readings at one event per minute. In a day, I am collecting 1440 event readings, broken down into 60 events per hour, times 24 hours. Based on this logged data, I can then create all types of graphs to tell me, were the lights on in the office, and was someone physically in that office? Then, I can use this to roughly estimate my total hours worked on a particular day, but also the specific start and end times, break times, and trends across weeks or months.
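The logging cadence is simple enough to sketch in Go. This is a minimal sketch, not the actual code from my repo; `readAndLog` is a hypothetical stand-in for the real sensor-reading code.

```go
package main

import (
	"fmt"
	"time"
)

// eventsPerDay spells out the arithmetic from above: events per
// minute, times 60 minutes, times 24 hours.
func eventsPerDay(perMinute int) int {
	return perMinute * 60 * 24
}

// runLogger is a sketch of the logging loop: a ticker fires once a
// minute and readAndLog (hypothetical) appends one JSON event to
// the local log file.
func runLogger(readAndLog func()) {
	ticker := time.NewTicker(time.Minute)
	defer ticker.Stop()
	for range ticker.C {
		readAndLog()
	}
}

func main() {
	fmt.Println(eventsPerDay(1)) // 1 event/min works out to 1440 events/day
	// In the real program, runLogger(readAndLog) would block here forever.
}
```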

To give you an idea of what this thermal data actually looks like when visualized, check this out. Here is a quick data capture of me walking into my office and waving at the sensor. You can see it is totally pixelated, as this is only 64 data points, but it is really useful for my needs. Having less data is actually good here. I was originally thinking of using something like OpenCV for object detection, or maybe creating an ML model, but it turns out I can just check a few specific pixels, to see if the temperature is high enough, and use that to determine if I was in the room. This is super simple and works really well without being too complex. Obviously, this is highly tuned for my specific use-cases, and rooms, as I know the layout of where I will be and trigger off that data.
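The "check a few specific pixels" idea can be sketched in a few lines of Go. The pixel indices and the temperature threshold here are assumptions for illustration; in practice you find them by sitting in the room and watching the live readings.

```go
package main

import "fmt"

// deskPixels are the grid cells that cover my desk chair. These
// indices are hypothetical; yours depend on sensor placement.
var deskPixels = []int{27, 28, 35, 36}

const presenceThreshold = 23.0 // degrees C, an assumed cutoff

// present reports whether any watched cell in the 8x8 grid
// (64 values, row-major) reads warm enough to be a person.
func present(grid [64]float64) bool {
	for _, i := range deskPixels {
		if grid[i] >= presenceThreshold {
			return true
		}
	}
	return false
}

func main() {
	var empty [64]float64
	for i := range empty {
		empty[i] = 19.5 // typical empty-room reading from the logs
	}
	occupied := empty
	occupied[28] = 26.0 // body heat lands in a watched cell

	fmt.Println(present(empty), present(occupied)) // false true
}
```

No OpenCV, no ML model, just a threshold over a handful of cells.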

So, that covers my office, but I also have the same sensor in my bedroom too. For this second sensor, I also have a long USB cable and LED breakout board attached to it. I am using this as a type of smart nightlight in addition to all the data logging. Basically, the added nightlight functionality can detect that the lights are off and, when I get up at night, flip the nightlight on so I can see where I am walking. Let me explain. So, this is my bedroom. Well, this is a stock image of what my bedroom might look like. Again, the sensor is continually checking, and logging, data from these points in the room. I can use this to track my sleeping patterns too, based on room presence.

You can see when I am there based on temperature, or thermal, data (even when the lights are off). But, for this added nightlight feature, I can say, if the lights are off, and the heat values in these cells are consistent with someone being there, then trigger the LED light for a period of time. This nightlight feature was sort of an afterthought, but it works really well and can turn the light on within 100ms of me being detected. What I like about this is that it is totally passive from my perspective; this sensor is just sitting in the room and collecting data. I am sure you could imagine all sorts of use cases besides what I am doing here.
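The nightlight rule itself is just two checks combined. Here is a hedged sketch in Go; the lux cutoff, heat threshold, and walkway pixel indices are all assumptions, and the actual LED call would go through the BlinkStick library rather than the placeholder comment here.

```go
package main

import "fmt"

const (
	luxDark       = 1.0  // assumed: below this, the room lights are off
	heatThreshold = 23.0 // assumed: degrees C meaning "someone is there"
)

// walkwayPixels are the hypothetical grid cells covering where I
// walk when getting up at night.
var walkwayPixels = []int{40, 41, 48, 49}

// shouldTrigger decides if the nightlight should come on: the room
// must be dark AND a walkway cell must be warm enough.
func shouldTrigger(lux float64, grid [64]float64) bool {
	if lux >= luxDark {
		return false // lights already on, no nightlight needed
	}
	for _, i := range walkwayPixels {
		if grid[i] >= heatThreshold {
			return true
		}
	}
	return false
}

func main() {
	var grid [64]float64
	grid[41] = 25.0 // warm body in the walkway
	fmt.Println(shouldTrigger(0, grid))  // true: dark + presence
	fmt.Println(shouldTrigger(32, grid)) // false: lights are on
	// On true, the real loop would turn the LED on for a period
	// of time, then re-arm after a cooldown.
}
```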

The Details

Alright, so now that you know what the high-level device looks like and my use-cases, let me walk you through this in a little more detail. By the way, all the code for this is up on my GitHub, and a detailed build log is in the episode notes below, if you wanted to replicate this.

First, I am using the Raspberry Pi Zero wireless model. This is a super small Linux computer, about the size of half a credit card, that supports Wifi. The picture here does not do justice to how small this thing is. It is tiny. I purchased a kit that comes with the Raspberry Pi Zero, a case, and a power adapter. You will also need a micro SD card for installing the base OS and detection software, along with all types of tiny adapters: micro HDMI, and micro USB to USB for a keyboard and mouse. This cost a few extra bucks since I did not have these cables around, but you only need power once you have everything configured, as you can just connect with SSH over your network.

Next, I am attaching a light sensor that will return a value for how much light it senses. This just returns a float value and then you can log that. I purchased the TSL2591 sensor from Adafruit and it works seamlessly with the Raspberry Pi using the I2C bus pins once you solder it up. This sensor gives you a few values from which you can calculate the amount of lux visible to the device. The Wikipedia page says lux is the SI derived unit of illuminance, measuring luminous flux per unit area. Basically, I just think of it as the amount of light the sensor can see, and you will get a float value out of it. For example, when the lights are off, I get a value of zero, and when they are on the value is around 30. So, it is super easy to log and plot this data.
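Since the lux value sits at roughly 0 (off) or 30 (on), classifying "lights on" is just a threshold check. One refinement worth sketching: a little hysteresis so a noisy reading near the cutoff does not flap between states. The specific thresholds here are assumptions, not values from my actual code.

```go
package main

import "fmt"

// lightsState applies a threshold with hysteresis: turn on above
// 20 lux, off below 10 lux, and hold the previous state in the
// dead band between them. Both cutoffs are assumed values sized
// around the observed 0-vs-~30 readings.
func lightsState(lux float64, wasOn bool) bool {
	const onAbove, offBelow = 20.0, 10.0
	switch {
	case lux >= onAbove:
		return true
	case lux < offBelow:
		return false
	default:
		return wasOn // dead band: keep the previous state
	}
}

func main() {
	fmt.Println(lightsState(30, false)) // true: clearly on
	fmt.Println(lightsState(0, true))   // false: clearly off
	fmt.Println(lightsState(15, true))  // true: dead band holds state
}
```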

Next, we have the thermal sensor, which grabs 64 temperature values, in an 8x8 grid, at a distance of up to around 20 feet. I purchased this AMG8833 sensor from Adafruit too and connected it to the Raspberry Pi using the I2C bus pins as well. The Raspberry Pi is an awesome platform for playing around with this type of stuff. There are tons of sensors you can connect if you browse around on their website. Pretty much the sky's the limit. The thermal sensor here works at either 1 frame per second or 10 frames per second. For my office, I am just running this at 1 frame per second and logging the data at one event per minute. For the bedroom sensor, I am running this at 10 frames per second, since I want to trigger the nightlight quickly, but am only logging data at one event per minute. You can think of this like a thermal camera, like a FLIR, but with a very limited resolution of only 64 pixels. You can read all about the theory on the Wikipedia page here. A cool story: we used to have one of these high resolution FLIR cameras at work, and you could touch your hand to the table and remove it. Then, point the FLIR camera at that area and clearly see your hand print, since your hand raised the temperature slightly. It was pretty awesome.

Alright, next, on the bedroom sensor, I attached this LED breakout board for the nightlight logic. I had a few of these LED boards around from a previous project and they are really cool. It is called a BlinkStick; basically, you just connect this via USB, and then you can program it to do all types of cool things. I was using it for notification alerts and things like that. They cost about 15 bucks and work great for giving visual indications from automated programs. They support lots of programming languages too.

All of this sensor data is being captured on the device and logged into a local file. I have had these devices running for about 3 months now. When I want to look at the data, I just scp the log file out and process it. But, at some point I want to have this data streamed out and automate the analysis process. I also wrote a simple web server on the Raspberry Pi, that sits on the local network, and gives you a real-time view of what these sensors are seeing. I used this heavily during development to quickly debug things. For example, I used this to check the light values with the lights turned on and off, and to make sure the thermal sensor was actually pointed at where I would be in the rooms, and sort of see the bounds of the viewing area. The code for this is posted up on GitHub. I wrote all this in Go. This did create a problem though, as the thermal sensor did not have a driver written in Go, so I ported the Python driver to Go. You can download the driver too if you wanted to recreate this exactly. Go is my personal programming language of choice right now, and I had never worked with I2C in Go before, so this part was more of a learning exercise than anything.

Let's take a look at the web interface for a minute. So, this is a live reading of what the office sensor is seeing right now. You can see the timestamp, the lux (this is the visible light in the room), and then down here we have the thermal sensor data plotted out. There is actually this cool project that allows you to run gnuplot in JavaScript, and that is what is plotting this image. So, you can see all the values down here and you can change them live. Having this web server greatly helped with debugging too, as I could adjust things on the fly. For example, changing the position of the sensors to get good readings. I was mostly using this on my phone as I was walking around, reloading the page to check things out. This is how I was able to find the pixels of where I was sitting in the office for all the detection logic. I did the same thing in my bedroom to find the pixels for triggering the nightlight.

Finally, let's take a look at the logged data. I originally was not sure how I wanted to visualize this data, so I just logged it in JSON format. It turns out you can import raw JSON into BigQuery and it will auto-detect the schema and create your tables for running SQL on it. I was using BigQuery since that is what I am used to for data analytics. But, you could honestly use whatever you wanted since this is just JSON.

{"Timestamp":"2019-01-02T18:57:49.730148975-08:00","Pixels":[20.5,22.75,21.75,20,20,19.5,19.75,20.25,21.25,21.25,21.25,20.5,19.5,19.25,19.5,19.75,23,21.5,20,19.5,19.25,19.5,19.25,19.75,19.5,19.75,19.75,19.25,19.5,19.5,19.25,20,19.75,19.5,19.75,19.75,19.75,19.5,19.75,20.5,19.75,19.75,20,20,19.5,19.75,19.75,20.5,19.5,20.25,20,20,19.75,19.75,20.25,20.5,20.25,20,20,20.25,19.75,20,20.5,20.75],"Lux":0} {"Timestamp":"2019-01-02T18:58:49.73005184-08:00","Pixels":[20.75,22.75,22,20,20,19.75,19.75,20.5,21.25,21.5,21.25,20.75,19.5,19.5,19.75,19.75,23,21.5,20.25,19.75,19.5,19.75,19.5,20,19.75,19.75,20,19.75,19.75,19.75,19.75,20,19.75,19.75,20,20,20,19.75,19.75,20.75,20,20,20.25,20.25,19.75,19.75,20,20.5,19.75,20.5,20,20.5,19.75,19.75,20.25,20.5,20.25,20.25,20,20.25,19.75,20,20.5,20.75],"Lux":0} {"Timestamp":"2019-01-02T18:59:49.730060028-08:00","Pixels":[20.5,22.5,21.5,19.75,19.75,19.5,19.5,20,21.25,21.25,21,20.5,19.5,19.25,19.5,19.5,22.75,21.25,19.75,19.25,19.25,19.5,19,19.5,19.5,19.25,19.5,19.25,19.25,19.5,19.25,19.5,19.5,19.5,19.5,19.5,19.75,19.25,19.5,20.25,19.75,19.5,19.75,19.75,19.5,19.75,19.75,20.25,19.5,20,19.75,20,19.5,19.5,20,20,19.75,19.75,19.75,19.75,19.5,19.75,20.25,20.5],"Lux":0} {"Timestamp":"2019-01-02T19:00:49.730048395-08:00","Pixels":[20.75,22.75,21.75,20.25,20,19.75,19.75,20.5,21.25,21.5,21,20.5,19.5,19.5,19.75,19.75,22.75,21.5,20.25,19.75,19.5,19.5,19.5,19.75,19.75,19.5,19.75,19.75,19.75,19.75,19.5,20,19.75,19.75,19.75,19.75,20,19.75,19.75,20.5,20,19.75,20,20,19.75,20,20,20.5,19.75,20.25,20.25,20.25,19.75,20,20.25,20.25,20,19.75,20,20,19.75,20,20.25,20.5],"Lux":0} {"Timestamp":"2019-01-02T19:01:49.730062809-08:00","Pixels":[20.75,22.5,21.75,20,20,19.75,19.5,20,21.25,21.25,21.25,20.5,19.5,19.5,19.75,19.75,22.75,21.5,20,19.5,19.25,19.5,19.5,19.5,19.75,19.5,19.75,19.5,19.5,19.75,19.5,20,19.75,19.5,19.75,19.75,20,19.5,19.75,20.5,20,19.5,20,20,19.5,19.75,19.75,20.5,19.5,20.25,20,20,19.5,19.75,20,20.25,19.75,19.75,20,20,19.75,20,20.25,20.75],"Lux":0} 
{"Timestamp":"2019-01-02T19:02:49.730391152-08:00","Pixels":[20.75,22.5,21.75,20.25,19.75,19.5,19.5,20.25,21,21.25,21,20.5,19.5,19.25,19.5,19.75,22.5,21.25,20,19.75,19.25,19.5,19.25,19.5,19.75,19.5,19.75,19.5,19.5,19.5,19.25,19.75,19.5,19.5,19.5,19.5,19.75,19.75,19.5,20.5,20,19.5,20,19.75,19.75,19.5,19.75,20.25,19.75,20.25,20,20,19.75,19.5,20.25,20.25,20,19.75,20,20,19.75,19.75,20.25,20.75],"Lux":0} { "Timestamp": "2019-01-02T19:03:49.730047329-08:00", "Pixels": [ 20.75, 22.5, 21.75, 20, 20, ... 19.75, 20, 20.25, 20.75 ], "Lux": 32.4141 }

Here is what the log file actually looks like. You can see the timestamp data, the thermal pixel reading data, and then the lux value. That is basically it. Personally, I like analyzing the data using SQL since you can do all types of grouping and sum calculations with ease.

Data Analysis and The Results

From here, I created a few charts. First, I just wanted to see if I could get the lux data plotted across time. Next, I asked myself: if lux is over a specific value, say 30, can I determine that the lights are on for that period, group this into minutes, and then get a sum of how many hours per day the lights were on? Basically, tell me how many hours per day the lights were on in the office. Next, let's do the same thing for the thermal data: count how many hours per day a human was detected in that room (based on temperature readings for a few specific pixels in that grid). Finally, let's plot both the lights-on data and the thermal data in a single chart. This is still pretty early, as I just started looking at the data after logging for a few months, but it mostly works and is good enough for what I need.

So, let's look at the data in BigQuery, along with the spreadsheets, as I have this stuff plotted out. I have posted all the SQL queries in the episode notes below too, if you wanted to replicate this. By the way, BigQuery is Google's data warehouse product. You can literally process petabytes of data with this thing very quickly. So, this is totally overkill for my needs, as I only have 50 megs of data here, but I was familiar with it and it works the same whether you have petabytes or megabytes.

I just imported the raw JSON data into BigQuery and it created this 2019_01_02_office table. You can see the schema here with our timestamp, lux, and thermal pixel data. Here is what a preview of that data looks like. What is cool about BigQuery is that it allows you to store a struct or array natively inside the table, so I can actually query a specific thermal pixel using SQL.

#StandardSQL
SELECT
  EXTRACT(MONTH FROM Timestamp) AS Month,
  EXTRACT(DAY FROM Timestamp) AS Day,
  TRUNC(SUM(lux)) AS light
FROM `snappy-203307.tmp.2019_01_02_office`
GROUP BY Month, Day
ORDER BY Month, Day

This is the query for getting a total lux sum per day. You can see the results down here. This was more of a building block than anything else. Then, I downloaded this as a CSV, imported it into a spreadsheet, and graphed it. You can also export the results directly to a spreadsheet, but I wanted to explore this data in my text editor too, so this workflow worked for me. This was more of a proof of concept as I was exploring the data.

#StandardSQL
SELECT
  FORMAT_TIMESTAMP("%Y-%m-%d %H:00", Timestamp) AS event_date,
  TRUNC(SUM((SELECT SUM(x) FROM UNNEST(s.Pixels) x))) AS thermal,
  TRUNC(SUM(lux)) AS light
FROM `snappy-203307.tmp.2019_01_02_office` s
GROUP BY event_date
ORDER BY event_date ASC

Next, let's run a similar query for the lux and thermal data. You can see the query results down here. I did the same thing: exported the data as CSV, then imported it into a spreadsheet and graphed it.

But, this was not really that useful for getting what I wanted. I wanted to get the hours per day that I was actually in that room. So, after a bit of trial and error, I finally came up with a SQL query that did what I was looking for. Basically, get all the events where the light value was over 30, which indicated the lights were on, then group those by minute, and divide by 60 to give me a sum per day of how many hours the lights were on. You can see the results here. Then, let's graph it.

#StandardSQL
WITH events AS (
  SELECT
    FORMAT_TIMESTAMP("%Y-%m-%d %H:%M", Timestamp, "-08:00") AS event_date,
    TRUNC(SUM(lux)) AS light,
    CASE WHEN TRUNC(SUM(lux)) >= 30 THEN 1 ELSE 0 END AS lights_on
  FROM `snappy-203307.tmp.2019_01_02_office`
  GROUP BY event_date
)
SELECT
  FORMAT_DATETIME("%Y-%m-%d", PARSE_DATETIME('%Y-%m-%d %H:%M', e.event_date)) AS event_date,
  TRUNC(SUM(e.lights_on) / 60) AS lights_on
FROM events e
GROUP BY event_date
ORDER BY event_date ASC

I was looking at the data periodically on my phone using the web server, but only looked at this data with SQL after about 3 months. So, you can see here there are a bunch of low values.

I was definitely working longer hours during this time, as I had just launched my subscription service. My thinking is that this is where I was testing some acoustic panels in my office. I was trying to get a really quiet space for recording audio, and I was obstructing the view of the sensor a little, so it logged data at smaller values than what I am currently looking for in the SQL queries. So, I might need to amplify this window of data a bit. But, I am still thinking about whether I even care, since it is working correctly now, and what does that extra work actually get me? Anyway, you can now clearly see I can totally get the hours per day the light was on. Which is pretty awesome.

#StandardSQL
WITH events AS (
  SELECT
    FORMAT_TIMESTAMP("%Y-%m-%d %H:%M", Timestamp, "-08:00") AS event_date,
    TRUNC(SUM(lux)) AS light,
    CASE WHEN TRUNC(SUM(lux)) >= 30 THEN 1 ELSE 0 END AS lights_on,
    CASE WHEN
      TRUNC(SUM(Pixels[ORDINAL(1)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(2)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(3)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(4)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(5)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(6)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(7)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(8)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(9)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(10)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(11)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(12)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(13)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(14)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(15)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(16)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(17)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(18)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(19)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(20)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(21)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(22)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(23)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(24)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(25)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(26)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(27)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(28)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(29)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(30)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(31)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(32)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(33)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(34)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(35)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(36)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(37)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(38)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(39)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(40)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(41)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(42)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(43)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(44)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(45)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(46)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(47)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(48)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(49)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(50)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(51)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(52)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(53)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(54)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(55)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(56)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(57)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(58)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(59)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(60)])) >= 23 OR
      TRUNC(SUM(Pixels[ORDINAL(61)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(62)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(63)])) >= 23 OR TRUNC(SUM(Pixels[ORDINAL(64)])) >= 23
    THEN 1 ELSE 0 END AS thermal_on
  FROM `snappy-203307.tmp.2019_01_02_office`
  GROUP BY event_date
)
SELECT
  FORMAT_DATETIME("%Y-%m-%d", PARSE_DATETIME('%Y-%m-%d %H:%M', e.event_date)) AS event_date,
  TRUNC(SUM(e.lights_on) / 60) AS lights_on,
  TRUNC(SUM(e.thermal_on) / 60) AS thermal_on
FROM events e
GROUP BY event_date
ORDER BY event_date ASC

Next, let's do the same thing for the light and thermal data together. Basically, counting how many hours per day the lights were on and someone was in the office. You can see a results preview here. Again, I downloaded a CSV, imported it into a spreadsheet, and graphed the data.

Now, we can see the lights-on per day graph, the thermal detection events per day graph, and an overlapping graph of both datasets. This is where I am at today with this project, and I probably need to tune this a little better, but it mostly works as a proof of concept. And this allows me to track my work and sleep habits. What I am not showing here is that I ran all this for the bedroom sensor too, and can now plot my sleep schedule over the past few months. Obviously, this does not tell you about the quality of work or sleep, but it can act as a guide.

Future Work

So, what next? Well, I want to create some type of dashboard that will show me this data automatically in real-time. It would actually be nice to get a daily email where I could log what I was working on the previous day and my hours worked, and input the quality of my work and sleep. This might surface some useful patterns over a few months.

Also, I want something that tells me the specific hours worked, in some type of 24 hour heatmap, vs just grouping data into totals like this. This would tell me what times I typically work, sleep, etc. I know many of these hours are in the late day and early morning, so it would be nice to visually see my working and sleeping patterns. What I like about all this is that it is totally passive and just logs the data for me. I also like that I have total control over it, for privacy reasons.

Anyways, hopefully this was interesting and if you are looking to do something similar, this might work as an initial prototype. You can find a complete build log below with all the part numbers, links to code, and all that.

Alright, that’s it for this episode. Thanks for watching and I will see you next week. Bye.

Build Log

Parts list:

Configure the Pi

Flashing the disk image

~ diskutil list
/dev/disk0 (internal, physical):
   #:                       TYPE NAME                    SIZE       IDENTIFIER
   0:      GUID_partition_scheme                        *251.0 GB   disk0
   1:                        EFI EFI                     209.7 MB   disk0s1
   2:          Apple_CoreStorage Macintosh HD            250.1 GB   disk0s2
   3:                 Apple_Boot Recovery HD             650.0 MB   disk0s3
/dev/disk1 (internal, virtual):
   #:                       TYPE NAME                    SIZE       IDENTIFIER
   0:                  Apple_HFS Macintosh HD           +249.8 GB   disk1
                                 Logical Volume on disk0s2
                                 9A23B464-28B6-470D-ABBB-95186EE99C05
                                 Unencrypted

~ diskutil list
/dev/disk0 (internal, physical):
   #:                       TYPE NAME                    SIZE       IDENTIFIER
   0:      GUID_partition_scheme                        *251.0 GB   disk0
   1:                        EFI EFI                     209.7 MB   disk0s1
   2:          Apple_CoreStorage Macintosh HD            250.1 GB   disk0s2
   3:                 Apple_Boot Recovery HD             650.0 MB   disk0s3
/dev/disk1 (internal, virtual):
   #:                       TYPE NAME                    SIZE       IDENTIFIER
   0:                  Apple_HFS Macintosh HD           +249.8 GB   disk1
                                 Logical Volume on disk0s2
                                 9A23B464-28B6-470D-ABBB-95186EE99C05
                                 Unencrypted
/dev/disk2 (internal, physical):
   #:                       TYPE NAME                    SIZE       IDENTIFIER
   0:     FDisk_partition_scheme                        *16.0 GB    disk2
   1:             Windows_FAT_32 NO NAME                 16.0 GB    disk2s1

~ diskutil unmountDisk /dev/disk2
Unmount of all volumes on disk2 was successful

~ sudo dd bs=1m if=/Users/jweissig/Downloads/2018-11-13-raspbian-stretch-full.img of=/dev/rdisk2 conv=sync
5052+0 records in
5052+0 records out
5297405952 bytes transferred in 476.447255 secs (11118557 bytes/sec)

~ sudo diskutil eject /dev/rdisk2
Disk /dev/rdisk2 ejected

~ diskutil list
/dev/disk0 (internal, physical):
   #:                       TYPE NAME                    SIZE       IDENTIFIER
   0:      GUID_partition_scheme                        *251.0 GB   disk0
   1:                        EFI EFI                     209.7 MB   disk0s1
   2:          Apple_CoreStorage Macintosh HD            250.1 GB   disk0s2
   3:                 Apple_Boot Recovery HD             650.0 MB   disk0s3
/dev/disk1 (internal, virtual):
   #:                       TYPE NAME                    SIZE       IDENTIFIER
   0:                  Apple_HFS Macintosh HD           +249.8 GB   disk1
                                 Logical Volume on disk0s2
                                 9A23B464-28B6-470D-ABBB-95186EE99C05
                                 Unencrypted

Setting up the device

Boot up into X, configure password, timezone, networking, install updates, and reboot
ctrl + alt + 1
Login using "pi" and password
sudo su -
ifconfig
free -m
uptime
uname -a
df -h
ps auxf
netstat -nap | grep LIST

Change the run level (https://www.raspberrypi.org/forums/viewtopic.php?t=74341)
raspi-config > 3 (boot options) > B1 (desktop / cli) > B1 (console)
raspi-config > 3 (boot options) > B3 (splash screen) > No (don't show)

Enable ssh
raspi-config > 5 (interfacing options) > P2 (SSH)

Enable i2c
raspi-config > 5 (interfacing options) > P5 (I2C)

Reboot, login as "pi", sudo su -
apt-get install vim

Disable ipv6 (https://www.raspberrypi.org/forums/viewtopic.php?t=138899)
add "ipv6.disable=1" to /boot/cmdline.txt
add "blacklist ipv6" to /etc/modprobe.d/ipv6.conf
verify with ifconfig & lsmod

Disable bluetooth (http://blog.mmone.de/2017/05/16/raspberry-pi-zero-w-disable-bluetooth/)
/boot/config.txt
dtoverlay=pi3-disable-bt
systemctl disable hciuart
systemctl disable bluetooth
systemctl disable bluealsa

Disable led (https://www.jeffgeerling.com/blogs/jeff-geerling/controlling-pwr-act-leds-raspberry-pi)
/boot/config.txt
# Disable the ACT LED on the Pi Zero.
dtparam=act_led_trigger=none
dtparam=act_led_activelow=on

Misc
systemctl disable sound.target
systemctl disable nfs-client.target
systemctl disable remote-fs-pre.target
systemctl disable remote-fs.target

Configure I2C for AMG8833

I2C Addresses: https://learn.adafruit.com/i2c-addresses?view=all
TSL2591 light sensor (0x29 only)
AMG8833 IR Thermal Camera Breakout (0x68 or 0x69)

Configure I2C for TSL2591

Adafruit TSL2591 High Dynamic Range Digital Light Sensor
https://www.adafruit.com/product/1980
https://learn.adafruit.com/adafruit-tsl2591?view=all

Install I2C tooling

Configuring I2C: https://learn.adafruit.com/adafruits-raspberry-pi-lesson-4-gpio-setup/configuring-i2c
https://github.com/adafruit/Adafruit_CircuitPython_TSL2591
https://github.com/adafruit/Adafruit_CircuitPython_TSL2591/blob/master/examples/tsl2591_simpletest.py
apt-get install python-smbus i2c-tools (already installed)

Check if devices are installed

pi@raspberrypi:~ $ sudo i2cdetect -y 1
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- 29 -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: -- -- -- -- -- -- -- --

Configure second I2C bus

https://lb.raspberrypi.org/forums/viewtopic.php?t=144719

/boot/config.txt
dtparam=i2c_vc=on

root@raspberrypi:~# i2cdetect -l
i2c-1   i2c   bcm2835 I2C adapter   I2C adapter
i2c-0   i2c   bcm2835 I2C adapter   I2C adapter

root@raspberrypi:~# i2cdetect -y 1
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- 29 -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: -- -- -- -- -- -- -- --

root@raspberrypi:~# i2cdetect -y 0
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: -- -- -- -- -- -- -- --

Log data using the sense program from GitHub

nohup ./sense >> 2019-01-02-office.json &