Recover Data from HTerm

Last week I had the task of logging XBee data with my laptop using the really great terminal program HTerm. After the crucial data had arrived, the program hung up and was unresponsive.

The first aid was to take a screenshot, but this recovers only the latest data, not the whole log.

Without closing the unresponsive window I opened the task manager, went to the applications tab and searched for HTerm. Then I chose the option to show its process. The next step is to „Create Dump file“ (or „Abbilddatei“).

To regain access to your data you have to search the dump file for it. In my case I searched for GPS data lines starting with „$GPS“. So I wrote a little Python script that scans all lines for this specific string and saves everything into a new log file.

# -*- coding: utf-8 -*-
"""Recovery of data from HTerm
@author: michael russwurm
"""

from __future__ import print_function

# open the dump file in binary mode and the output log in text mode
log_file = open("HTerm.dmp", 'rb')
recovered_data = open("recovered_data.log", 'w')
for line in log_file:
    # the dump is binary, so decode each line and drop undecodable bytes
    line = line.strip().decode('ascii', errors='ignore')
    if '$GPS' in line:
        recovered_data.write(line + '\n')
        print(line)
    #else:  # for debugging
    #    print(line)
log_file.close()
recovered_data.close()
input("Press Enter to exit.")

Reading Analog Light Sensor Values via WIFI

Recently I got a Texas Instruments MSP430F5529 LaunchPad and searched for an easy way to try it with my CC3000 BoosterPack. One of the first ideas was a wireless sensor node with a simple web server providing data for later analysis.

First the controller reads an analog value from a light-dependent resistor and then it provides a JSON file for download (chosen especially for its flexibility to add or change values / parameters), which is polled. The analysis and saving step was done via the requests and pickle Python libraries.

So first to the code providing the JSON file on the web server running on the MSP430 LaunchPad. For quick development I programmed it in Energia and started from the Simple Web Server example created by Robert Wessels (who derived it from Hans Scharler).

The first step is to change the SSID and password to allow the connection.

#define WLAN_SSID       "myssid"
#define WLAN_PASS       "mypassword"

Then add the definitions for the analog pin and the value variable

int analogPin = A0;     // analog pin
int val = 0;           // variable to store the value read

The client gets the right file when browsing to xxx.xxx.xxx.xxx/data.json (check the IP address) with

        if (currentLine.endsWith("GET /data.json ")) {
          statusConfig = 0;
          printData();
        }

within the loop() function. This calls printData().

void printData() {
  //Serial.println("Print Data");

  client.println("HTTP/1.1 200 OK");
  client.println("Content-type:application/json; charset=utf-8");
  client.println();
  val = analogRead(analogPin);    // read the input pin
  //Serial.println(val);             // debug value

  client.println("[");
  client.println("{");
  client.println("\"light_sensor\": {");
  client.print("\"value\": ");
  client.print(val);
  client.println("}");
  client.println("}");
  client.println("]");

  client.println();
}

which reads the analog sensor and provides the JSON file to the client:

[
    {
        "light_sensor": {
            "value": 773}
    }
]
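
A quick way to check that the LaunchPad really serves this file is a one-off request from Python (the placeholder IP has to be replaced with the board's actual address):

import requests

# single request to the LaunchPad (replace the placeholder IP with the real address)
resp = requests.get('http://xxx.xxx.xxx.xxx/data.json', timeout=30)
print(resp.json()[0]["light_sensor"]["value"])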

The hardware setup itself is more or less simple: a 1 kΩ resistor is connected between ground and pin A0, and the photoresistor is connected between +5 V and A0. Then program the LaunchPad (with the BoosterPack attached) and supply it with power.
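
To get a feeling for what the ADC will see, here is a back-of-the-envelope sketch of the divider; the photoresistor values are just assumed typical LDR resistances, not measured ones:

# voltage divider check (all resistance values are assumptions)
# photoresistor between supply and A0, fixed 1 kOhm between A0 and ground
v_supply = 5.0                             # supply voltage as used above [V]
r_fixed = 1000.0                           # fixed resistor to ground [Ohm]
for r_photo in (500.0, 5000.0, 50000.0):   # bright ... dark (assumed LDR range)
    v_a0 = v_supply * r_fixed / (r_fixed + r_photo)
    print("R_photo = %6.0f Ohm -> V_A0 = %.2f V" % (r_photo, v_a0))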

Now to the logging side. A simple Python script polls the data using requests with exception handling, i.e. timeouts (necessary because the server is not a Google server…), in the function read_json_data:

from __future__ import division, print_function
import requests, json
import time
import datetime
import numpy as np
import pickle
import socket

## Read Light Measurement Data from IP via JSON-File
# Michael Russwurm 2014
# under GNU GENERAL PUBLIC LICENSE Version 3, 29 June 2007

print("# Gathering Light Sensor Measurements #")
print("Started at ", datetime.datetime.now())

def read_json_data(url_string, time_out):
    """Reads Data from url string in JSON format.\n
    Returns False if not working properly or JSON object without errors.
    time_out in seconds defines request duration.
    Imports needed: json, requests and socket.
    """
    exception = False
    try:
        req = requests.get(url_string, timeout=time_out)
        #print(req.json())
        #print(req.encoding)
        #print(req.text)
    except requests.exceptions.RequestException as requests_error:
        print ("Error: ", requests_error)
        exception = True
    except socket.error as socket_error:
        print("Error: ", socket_error)
        exception = True
    if exception is False:
        data = json.loads(req.content)
        data = data[0] # data[0][...][...] not necessary anymore
    else:
        data = False
    return(data)

COUNT = 100 # number of outer blocks of 100 samples each
for j in range(COUNT):
    data_array = []
    for i in range(100):
        # change IP address HERE, 30 s timeout
        data_point = read_json_data('http://xxx.xxx.xxx.xxx/data.json', 30) 
        time.sleep(5)
        if data_point is not False:
            data_array.append([datetime.datetime.now(),
                               data_point["light_sensor"]["value"]])
            #print("Light Sensor: ",data_point["light_sensor"]["value"],
                               #" No. ",(i+1))
        else:
            #print("Light Sensor: ERROR "," No. ",(i+1))
            time.sleep(30)
    print("Try to append to existing file")
    try:
        data_array = np.append(pickle.load(open("light_sensor_data.p", "rb")),
                         np.array(data_array), axis=0)
        print("Existing file found - append to existing file")
    except IOError:
        data_array = np.array(data_array)
        print("No existing file found - creating new one")
    pickle.dump(data_array, open("light_sensor_data.p", "wb"))
    print("Saved at ", datetime.datetime.now())
print("Finished")

A for loop polls the site every 5 seconds and appends the integer value together with a timestamp (datetime.datetime.now()) to a list (data_array). After 100 values the whole array is saved to a file using pickle to keep the file size small. A try/except block checks whether a file already exists and appends to it. Another advantage of this approach is that it is possible to restart the program with a maximum loss of 100 measurement points.
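
If you just want to check that the logging works, the pickled file can be inspected with a few lines (independent of the plotting script below):

from __future__ import print_function
import pickle

# load the logged array and print a short summary
data_array = pickle.load(open("light_sensor_data.p", "rb"))
print("number of samples:", data_array.shape[0])
print("first sample:", data_array[0, 0], data_array[0, 1])
print("last sample: ", data_array[-1, 0], data_array[-1, 1])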

Last but not least, another script reads the data and plots it:

from __future__ import print_function, division
import pickle
import matplotlib.pyplot as plt
from matplotlib import dates

## Print Light Sensor Data
# Michael Russwurm 2014
# under GNU GENERAL PUBLIC LICENSE Version 3, 29 June 2007

data_array = pickle.load(open("light_sensor_data.p", "rb"))
print("# Plot Light Sensor Data #")
#print(data)

# matplotlib date format object
hfmt = dates.DateFormatter('%d.%m.%y %H:%M')
fig, ax = plt.subplots(1, 1)
# 12 bit = 4095, 3.3 V
ax.plot(data_array[:, 0], data_array[:, 1]/4095*3.3, "--o", label="Brightness")
ax.grid()
ax.legend(loc="lower right")
ax.set_ylabel("Brightness Sensor Output [V]")
ax.set_xlabel("Time")
ax.set_title("Light Sensor Data")
ax.xaxis.set_major_locator(dates.AutoDateLocator())
ax.xaxis.set_major_formatter(hfmt)
plt.gcf().autofmt_xdate()
plt.show()

This piece of code is also responsible for converting the ADC values to voltage values (a conversion to lux would be possible with calibration).
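
For a single reading the conversion looks like this; the lux factor below is a made-up placeholder, only a real calibration against a reference meter would give meaningful lux values:

adc_value = 773                      # raw 12-bit reading (example value from above)
voltage = adc_value / 4095.0 * 3.3   # 12 bit full scale = 4095 counts at 3.3 V
lux = voltage * 250.0                # hypothetical calibration factor, to be measured
print("%d counts -> %.2f V -> ca. %.0f lx (uncalibrated)" % (adc_value, voltage, lux))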

The result looks something like this:
Light Sensor Data

This system is easily adaptable to more than one sensor or even to more than one sensor node (although it is not very cheap).

Wurst-Index

Rußwurm, K. and Rußwurm, M. (2014). A Sunday afternoon-study for the new proposed “Wurst-Index”. Wurm-Verlag. Rabenstein.

This short study serves to answer the question of whether there is a relationship between the voting behaviour of European Song Contest viewers and the human-rights situation of homosexual, bisexual and transgender people in the respective country. These two are operationalised via the ILGA-Europe Rainbow Index [1] and the Wurst-Index. The ILGA index

rates each European country’s laws and administrative practices according to 42 categories and ranks them on a scale between 30 (highest score: respect of human rights and full legal equality of LGBT people) and -12.


The “Wurst-Index” represents an innovative measure of the acceptance of bearded women who sing, on a scale from 0 to 12 points, collected in 37 countries [2] via a self-initiated telephone survey, with 50 % of the score contributed by local expert committees (blinded, since they were not aware of the large-scale scientific study). With 120 million viewers, the social impact can be rated as extremely high.
Result: the two indices correlate strongly with each other, with r=0.614 (points), r=-0.517 (jury) and r=-0.448 (televoting), and the relationship is significant (p=8.72e-5, p=0.0018, p=0.0090); a further comparison with other indices is a field for future research.
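
For anyone who wants to reproduce this kind of analysis: the correlation coefficient and p-value can be computed with scipy.stats.pearsonr, as in this minimal sketch (the arrays are placeholders, not the actual study data):

from scipy.stats import pearsonr

ilga_index = [17, 4, 10, -3, 21]       # placeholder index values
contest_points = [89, 5, 33, 2, 174]   # placeholder point totals

r, p = pearsonr(ilga_index, contest_points)
print("r = %.3f, p = %.4g" % (r, p))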

Fig. 1: Points / ILGA


In Fig. 1 the ILGA index is plotted against the number of points (blue: linear fit, green: quadratic fit) to illustrate the results of the final round of the survey.

Fig. 2: ILGA / jury ranking (1-25)

Fig. 2 shows the ILGA index compared to the jury ranking (blue: linear fit, green: quadratic fit); Georgia is missing here because its points were awarded by the audience only.

Fig. 3: ILGA / televoting ranking (1-25)

Fig. 3 shows the ILGA index compared to the televoting ranking (blue: linear fit, green: quadratic fit); San Marino is missing here because its points were awarded by the jury only.

In general, Israel is missing from all figures because no ILGA data are available for it.

References:
[1] ILGA Europe. (2012). Rainbow Europe Map and Index. Retrieved 11 May 2014, from http://www.ilga-europe.org/home/publications/reports_and_other_materials/rainbow_europe_map_and_index_may_2012
[2] European Broadcasting Union. (2014). Eurovision Song Contest 2014 – Scoreboard. Retrieved 11 May 2014, from http://www.eurovision.tv/page/history/by-year/contest?event=1883#Scoreboard

pyrocket – Amateur Rocket Simulation Script

A little Python script for the simulation of an amateur rocket.

https://github.com/Lageos/pyrocket

It is a one-dimensional simulation (acceleration, velocity and altitude), which considers drag, temperature and density changing with altitude, as well as changing mass and thrust.
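
The core of such a simulation is a straightforward time integration. The following is only a minimal sketch of the idea (explicit Euler, made-up rocket parameters, a crude exponential atmosphere), not the actual code from the repository:

import numpy as np

g = 9.81                        # gravity [m/s^2]
dt = 0.01                       # time step [s]
m_dry, m_prop = 1.0, 0.5        # dry and propellant mass [kg] (assumed values)
thrust, t_burn = 50.0, 3.0      # thrust [N] and burn time [s] (assumed values)
cd, area = 0.5, 0.005           # drag coefficient and cross section [m^2] (assumed)

t, v, h = 0.0, 0.0, 0.0
m = m_dry + m_prop
while v >= 0.0 or t <= t_burn:           # integrate until apogee
    rho = 1.225 * np.exp(-h / 8500.0)    # crude density model [kg/m^3]
    drag = 0.5 * rho * cd * area * v * abs(v)
    f_thrust = thrust if t < t_burn else 0.0
    a = (f_thrust - drag) / m - g        # 1-D equation of motion
    v += a * dt
    h += v * dt
    if t < t_burn:                       # burn propellant at a constant rate
        m -= m_prop / t_burn * dt
    t += dt
print("apogee approx. %.1f m after %.1f s" % (h, t))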

Example Output of Amateur Rocket

It also computes the optimum separation time for two-stage rockets. This is kind of arbitrary, because it is (except when the drag difference is really huge) always best to use the inertia of the accelerated mass as long as possible in an unpropelled state to work against the drag.

One additional gimmick is the necessary angle of an auto-tracking device placed some distance from the launch pad.
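
The tracking angle itself is plain trigonometry; a small sketch, assuming the tracker sits at a known horizontal distance from the pad:

import numpy as np

# hypothetical helper: elevation angle for a tracker at a horizontal distance from the pad
def elevation_angle(altitude, distance):
    """Elevation angle in degrees at which the tracker has to point."""
    return np.degrees(np.arctan2(altitude, distance))

print(elevation_angle(500.0, 100.0))  # rocket at 500 m seen from 100 m away -> about 78.7 deg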

Rotary Encoder Disks with Matplotlib

Did you ever need a quick way to make your own rotary encoder disks?

Here is a quick way to make the graphics. Afterwards only a printer and scissors are necessary.

First import the packages numpy and matplotlib

import matplotlib.pyplot as plt
import numpy as np

And now define the outer and inner diameter as well as the number of lines:

di = 300.e-3  # inner diameter
da = 340.e-3  # outer diameter
lpr = 720     # lines per 360°

And here comes the rest:

fig, ax = plt.subplots(1, 1, subplot_kw=dict(polar=True))
theta = np.linspace(0., 2*np.pi, lpr+1)
radii = np.full(lpr+1, (da-di)/2)   # bar heights: width of the track
dis = np.full(lpr+1, di/2)          # bars start at the inner radius
ax.bar(theta, radii, width=np.pi/lpr, bottom=dis,   # half the line pitch -> 50 % duty cycle
       color='black', edgecolor='none', linewidth=0.)
ax.plot(0, 0, '.', color='black')   # center mark
ax.xaxis.set_visible(False)
ax.yaxis.set_visible(False)
fig.patch.set_visible(False)
ax.axis('off')
plt.show()

It is a polar plot with bars spanning from the inner to the outer diameter. It is important to suppress any lines which would disturb your signal (hide the ticks and labels).

And here is an example output:

Rotary Encoder Disk

The only thing left is to print and build.
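
For printing it is usually nicer to export the figure as a vector file instead of taking a screenshot. Appended to the snippet above (file names are arbitrary, and the physical scale should still be checked in the print dialog):

# save the disk as vector graphics for printing
fig.savefig("encoder_disk.pdf", bbox_inches="tight", pad_inches=0)
fig.savefig("encoder_disk.svg", bbox_inches="tight", pad_inches=0)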

Install Anaconda and IPython (notebook) in Ubuntu

Sometimes it is either hard to remember or there are many ways to achieve a certain goal. During the (re-)setup of my laptop with Ubuntu 12.04 I had to install IPython with its notebook capabilities.

The first step is to download the recent anaconda package from:
http://continuum.io/downloads
Choose either the 32 or 64 bit version. Next make it executable, either by right-clicking the downloaded file and choosing „Properties“ and then the „Permissions“ tab, or from a terminal with

sudo chmod +x Anaconda-1.x.x-Linux-xxx_xx.sh


Next start the installation by typing

./Anaconda-1.x.x-Linux-xxx_xx.sh 

Choose the recommended options. (Be careful not to hit the Enter key too often at the beginning, otherwise it kicks you out.)

This installs the necessary files, but we need to add the directory to the PATH. Therefore we change the profile configuration with

gedit ~/.profile

and change the line

PATH="$HOME/bin:$PATH"

to

PATH="$HOME/bin:$PATH:~/anaconda/bin"

Afterwards just save the file. EDIT: log out and back in (or restart) so that the updated PATH takes effect.

Next, though not strictly necessary, it is always favorable to update the existing packages. To do this we can use Anaconda's built-in package manager conda.
Type

conda update ipython

Now you're ready to start the IPython notebook from anywhere (although it is recommended to always start it from the same directory, otherwise it isn't possible to open the old files directly from the notebook manager).

 ipython notebook 
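
To verify that the Anaconda Python (and not the system Python) is picked up, a short check inside a Python or IPython session helps; the path in the comment is just the expected default:

import sys
print(sys.executable)       # should point into the anaconda directory, e.g. ~/anaconda/bin/python

import IPython
print(IPython.__version__)  # version installed / updated via conda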

How to use TeXmaker with subfolder build option and biber

I'm working with Texmaker and use the option „use a 'build' subdirectory“ to get a tidy folder.

\usepackage[style=numeric,backend=biber]{biblatex}
\addbibresource{bib/bib_example.bib}

My problem was passing the relative path to biber, so I got the error:

ERROR - Cannot find control file 'example.bcf'! 
- did you pass the "backend=biber" option to BibLaTeX? 
INFO - ERRORS: 1 

The solution is to change the Texmaker settings in the „Bib(La)TeX“ field to

"C:/Program Files/MiKTeX 2.9/miktex/bin/x64/biber.exe" build%