Continuously deliver static content with Hugo, CircleCI, and AWS S3.
Using Rekognition to build collections of faces to cross-identify people and parties
The cloud is great. Isn’t the cloud great? …and because you’re reading this, you know about running compute instances in the cloud. If you’re unlucky enough to still be dealing with code deployed to machines instead of containers, you might be stuck in the painful world of shipping code via automated or manual move-and-restart procedures. Perhaps you are lucky enough to be in a containerized environment, but not on Kubernetes.
What is KOPS? Self-described as “The easiest way to get a production grade Kubernetes cluster up and running” on AWS (and other providers; see below). KOPS looks a lot like Terraform: in broad strokes, it takes cluster- and context-specific arguments and creates the cloud resources that house and facilitate the use of a Kubernetes cluster. KOPS highlights: it automates the provisioning of Highly Available (HA) clusters on AWS from the CLI, much like helm or kubectl.
Metromile is a car insurance company that has been a leader in the pay-as-you-go insurance marketplace. Their rates are calculated from two components: a base rate composed of the rating for the vehicle type (think Porsche vs. Toyota Camry) and the registered address (home address) of the vehicle, and a per-mile rate for the driver based on their driving history and available data. In order to calculate the per-mile billing, the company issues you an OBD-II adapter (more on OBD-II).
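The two-component billing described above can be sketched in a few lines of Python. All rates and mileage figures here are made-up illustrations, not Metromile’s actual numbers:

```python
def monthly_premium(base_rate, per_mile_rate, miles_driven):
    """Pay-as-you-go premium: a fixed base rate plus a per-mile charge.

    Inputs are illustrative; real rates depend on vehicle type,
    registered address, and the driver's history/available data.
    """
    return base_rate + per_mile_rate * miles_driven

# Example: $29 base, 5 cents per mile, 400 miles driven this month.
print(monthly_premium(29.00, 0.05, 400))  # → 49.0
```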
Two weeks ago I transitioned my personal site, this site, to an SSL-based, secure-only site. Prior to the transition, the site was served from S3 with static website hosting enabled. Now the site is served from CloudFront backed by an S3 origin. I had done this before for work and for clients, but never with static web hosting. During this exercise, I found a documentation gap in the process, specifically with Terraform…
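For readers who want a starting point, here is a minimal, hedged Terraform sketch of a CloudFront distribution fronting an S3 static website endpoint. The bucket name, region, and resource names are placeholders, and provider syntax varies by version; this is not necessarily the exact configuration (or gap) the post discusses:

```hcl
resource "aws_cloudfront_distribution" "site" {
  enabled             = true
  default_root_object = "index.html"

  origin {
    # The S3 *website* endpoint (not the REST endpoint), declared as a
    # custom origin so index documents and redirects keep working.
    domain_name = "example-bucket.s3-website-us-east-1.amazonaws.com"
    origin_id   = "s3-website"

    custom_origin_config {
      http_port              = 80
      https_port             = 443
      origin_protocol_policy = "http-only" # website endpoints speak HTTP only
      origin_ssl_protocols   = ["TLSv1.2"]
    }
  }

  default_cache_behavior {
    target_origin_id       = "s3-website"
    viewer_protocol_policy = "redirect-to-https"
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```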
I’ve spent the last few weeks at work investigating and evaluating API gateways to drop in front of our present architecture. One of the candidates for evaluation was Amazon’s API Gateway. I had used API Gateway in the past for little things here and there, but never as a “simple” proxy layer in front of existing infrastructure. I set up a simple test, wrote a bunch of code to generate the necessary infrastructure, and executed the tests…
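A proxy-layer evaluation like this usually comes down to summarizing request latencies. Here is a small, hypothetical helper of the kind such a test harness might use (the function and sample data are mine, not from the post):

```python
def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (milliseconds)."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    # nearest-rank method: rank = ceil(pct/100 * n), clamped to at least 1
    rank = max(1, -(-len(ordered) * pct // 100))
    return ordered[int(rank) - 1]

# Illustrative timings (ms) for requests made through a gateway proxy.
latencies_ms = [12, 15, 11, 240, 13, 14, 16, 12, 13, 500]
print(percentile(latencies_ms, 50))  # → 13
print(percentile(latencies_ms, 95))  # → 500
```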
If you’re reading this, you likely own a Canary and you’re likely one of the many people on Twitter requesting an API from Canary. For at least a year, Canary has acknowledged those requests and redirected them to product. As consumers, we have yet to see an API or any concrete movement toward one. The only interface Canary has that is close to an API is the set of calls their Angular-based webapp makes when you log in to the dashboard.
Tonight I dropped the first of a few commits/releases of a Terraform module aimed at pulling and cheaply storing Canary security device sensor data (temperature, humidity, air quality) on AWS using Lambda and DynamoDB. Over the next few days I’ll add error handling to the API calls, token refresh support, and an API Gateway implementation that will allow for securely querying the data. The ultimate goal is to plot the historical data on graphs, for fun.
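A hedged sketch of how a Lambda handler might shape one reading into a DynamoDB item. The table name, key schema, and field names below are my assumptions for illustration, not the module’s actual layout:

```python
from decimal import Decimal

def build_sensor_item(device_id, timestamp, temperature, humidity, air_quality):
    """Shape one Canary sensor reading for DynamoDB.

    boto3 requires Decimal (not float) for DynamoDB numbers, so readings
    are converted via str() to avoid binary-float artifacts.
    """
    return {
        "device_id": device_id,   # assumed partition key
        "ts": timestamp,          # assumed sort key (epoch seconds)
        "temperature": Decimal(str(temperature)),
        "humidity": Decimal(str(humidity)),
        "air_quality": Decimal(str(air_quality)),
    }

item = build_sensor_item("canary-1", 1500000000, 21.5, 43.2, 0.9)

# With boto3 available and the table provisioned, the write itself would be:
#   boto3.resource("dynamodb").Table("canary_readings").put_item(Item=item)
```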
This is the first part of a series of posts on integrating the AWS Internet of Things (IoT) service offering with the developer-friendly Raspberry Pi platform. These posts will make use of one of the Raspberry Pi’s officially supported HATs: the Sense HAT. The Sense HAT provides an 8x8 LED grid, plus accelerometer, magnetometer, gyroscope, temperature, humidity, and pressure sensors. Raspbian, the officially supported, Debian-based Linux distro that commonly runs on the Raspberry Pi, has the necessary Python SDKs required to use the Sense HAT.
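As a taste of the Sense HAT’s Python SDK, here is a small sketch that reads the three environmental sensors and scrolls them on the LED grid. The `sense_hat` calls are the library’s real API, but the formatting helper and fallback values are mine; off-device, the import fails and the sketch just prints made-up readings:

```python
def format_readings(temp_c, humidity_pct, pressure_mb):
    """Render the three environmental readings as a short, scroll-friendly string."""
    return "T={:.1f}C H={:.0f}% P={:.0f}mb".format(temp_c, humidity_pct, pressure_mb)

try:
    # Only works on a Raspberry Pi with the Sense HAT attached and its
    # Python package installed.
    from sense_hat import SenseHat
    sense = SenseHat()
    sense.show_message(format_readings(
        sense.get_temperature(), sense.get_humidity(), sense.get_pressure()))
except ImportError:
    # Off-device: demonstrate the formatting with illustrative values.
    print(format_readings(24.8, 41.0, 1013.2))  # → T=24.8C H=41% P=1013mb
```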