Some notes on how I’ve been processing recent Mars images.

I came across an odd problem getting my shiny new IC-7300 working for digital modes using Omnirig.
I set the filter to FIL1, the widest (and default) filter, and the software (JTDX in this case) would transmit correctly. But after the transmission period, when it went back to receive, it would activate the narrower FIL2.
Here’s what’s going on…
Omnirig sends this instruction to the ICOM to go back into receive:
[pmDIG_U]
; These lines select USB-D for USB digital mode
Command=FEFE94E0.2600.01.01.FD
ReplyLength=15
Validate=FEFE94E026000101FD.FEFEE094FBFD
And according to the ICOM manual that should set the filter back to the “default filter”.
Trouble is nowhere in the manual can I see how to actually set the default filter for the mode.
So for now the best solution is to have Omnirig explicitly set us back to filter 1. I could just change FIL2 to be wider, but then the filters get more confusing. Having Omnirig explicitly set a filter when all I want the radio to do is go back to receive isn’t ideal, but it works. To do this you’ll need to change this section in the IC-7300-DATA.ini Omnirig file:
[pmDIG_U]
; These lines select USB-D for USB digital mode
Command=FEFE94E0.2600.01.01.01.FD
ReplyLength=16
Validate=FEFE94E02600010101FD.FEFEE094FBFD
This is identical to the original except we explicitly set the filter as per the manual and update the ReplyLength and Validate values accordingly.
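To see how those hex strings fit together, here’s a small Python sketch of my own (not part of Omnirig) that assembles the CI-V “set mode” frame used above. The 0x94 rig address and 0xE0 controller address are taken straight from the ini snippet.

```python
def civ_set_mode(mode, fil, rig_addr=0x94, ctl_addr=0xE0):
    """Build a CI-V 'set mode' frame (command 0x26, sub-command 0x00) as hex."""
    frame = bytes([
        0xFE, 0xFE,           # preamble
        rig_addr, ctl_addr,   # to the rig, from the controller
        0x26, 0x00,           # command 26 00: set mode on the selected VFO
        mode, 0x01,           # operating mode, data mode on
        fil,                  # explicit filter selection
        0xFD,                 # end of frame
    ])
    return frame.hex().upper()

# USB (mode 0x01) with data mode on and FIL1 explicitly selected --
# the same bytes as the Command line above:
print(civ_set_mode(0x01, 0x01))  # FEFE94E02600010101FD
```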
pip installing jupyterlab may give you the following error (somewhere in the error output anyway!):
```
4 | #include "zmq.h"
  |          ^~~~~~~
compilation terminated.
error: command 'gcc' failed with exit status 1
Failed with default libzmq, trying again with /usr/local
{'libraries': ['zmq'], 'include_dirs': ['/usr/local/include'], 'library_dirs': ['/usr/local/lib/amd64', '/usr/local/lib'], 'runtime_library_dirs': ['/usr/local/lib/amd64', '/usr/local/lib'], 'extra_link_args': ['-m64']}
```
The simplest solution I’ve found to this is just to install libzmq. We can pull the latest code from github and install it. I usually install things in /opt but as the pip process will look for libzmq in /usr/local, we’ll install it in there this time.
```
pkg install git libtool
git clone https://github.com/zeromq/libzmq.git /export/libzmq
cd /export/libzmq
./autogen.sh
./configure --prefix=/usr/local MAKE=gmake
gmake
gmake install
pip install jupyterlab
```
If you do put it somewhere other than /usr/local you might need to let pkg-config know. E.g.:
```
export PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/opt/libzmq/lib/pkgconfig/
```
Tested on a SPARC-M7 T7-1 kernel zone running 11.4.24.0.1.75.0
When trying to install scipy via pip on Solaris you may encounter this error:
```
numpy.distutils.system_info.NotFoundError: No lapack/blas resources found.
```
There’s a simple way to resolve this but it’s not immediately obvious. BLAS (Basic Linear Algebra Subprograms) and LAPACK (Linear Algebra PACKage) are actually provided by the Solaris math and perf libraries, so installing those and setting a couple of environment variables will get scipy and numpy up and running for us on Solaris.
```
pkg install sunperf system/library/math virtualenv gcc system/header

virtualenv-3.7 /export/venv
. /export/venv/bin/activate

BLAS=/usr/lib/libsunperf.so; export BLAS
LAPACK=/usr/lib/libsunmath.so; export LAPACK

pip install numpy scipy
```
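Once the build finds a LAPACK, routines like `numpy.linalg.solve` (which dispatches to LAPACK’s `*gesv`) will work. A quick smoke test of my own after installing:

```python
import numpy as np

# np.linalg.solve calls into LAPACK, so this only succeeds if numpy
# was built against a working LAPACK (sunperf in our case).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = np.linalg.solve(A, b)   # solves 3x + y = 9, x + 2y = 8
print(x)  # [2. 3.]
```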
Python 3.7 is in more recent SRUs. It’s best practice to use virtualenv to create a virtual environment to work in rather than installing into the system site-packages. I’ve seen a few users make a mess of the packaging system by updating the system Python libraries with pip, so please just don’t do it 🙂
Tested on a SPARC-M7 T7-1 kernel zone running 11.4.24.0.1.75.0
Another little snippet that I sometimes forget 🙂
Suppose you have a dataframe with a column that holds data as strings and you need to transform it into a form that a machine learning algorithm can use. One good way is one-hot encoding, which takes the values in the column and creates new columns of 1s and 0s representing the original data.
Have a look at this dataframe snippet:
|   | prod | rev |
|---|---|---|
| 0 | Solaris 11.3 | 2 |
| 1 | Solaris 11.3 | 1 |
| 2 | Solaris 11.4 | 4 |
| 3 | Solaris 11.4 | 5 |
| 4 | Solaris 11.3 | 1 |
One-hot encoding can be used to transform that:
```
from sklearn.preprocessing import OneHotEncoder

OH_encoder = OneHotEncoder(sparse=False)  # dense output so it fits in a DataFrame
OH = pd.DataFrame(OH_encoder.fit_transform(mydf[['prod']]))
display(OH.head())
pd.concat([mydf, OH], axis=1)
```
|   | prod | rev | 0 | 1 | 2 | 3 |
|---|---|---|---|---|---|---|
| 0 | Solaris 11.3 | 2 | 0.0 | 1.0 | 0.0 | 0.0 |
| 1 | Solaris 11.3 | 1 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2 | Solaris 11.4 | 4 | 0.0 | 0.0 | 1.0 | 0.0 |
| 3 | Solaris 11.4 | 5 | 0.0 | 0.0 | 1.0 | 0.0 |
| 4 | Solaris 11.3 | 1 | 0.0 | 1.0 | 0.0 | 0.0 |
And that’s fine for machine learning. Your models and pipelines will just handle it once it’s added to the original dataframe. But what if you actually want to poke around in the dataframe and use the data yourself? Then it would be really useful to have the label of each column reflect what the data in it actually is. No problem: just get the feature names from the encoder and rename the columns on the one-hot dataframe before concatenating it to the original frame:
```
column_names = OH_encoder.get_feature_names(['prod'])  # e.g. ['prod_Solaris 11.3', ...]
OH.columns = column_names
df3 = pd.concat([mydf, OH], axis=1)
```
|   | prod | rev | prod_Solaris 11.1 | prod_Solaris 11.3 | prod_Solaris 11.4 |
|---|---|---|---|---|---|
| 0 | Solaris 11.3 | 2 | 0.0 | 1.0 | 0.0 |
| 1 | Solaris 11.3 | 1 | 0.0 | 1.0 | 0.0 |
| 2 | Solaris 11.4 | 4 | 0.0 | 0.0 | 1.0 |
| 3 | Solaris 11.4 | 5 | 0.0 | 0.0 | 1.0 |
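If you don’t need the fitted encoder for a pipeline, pandas can do the encoding and the naming in one step with `get_dummies`. A sketch using a stand-in `mydf` like the one above (note that unlike the encoder approach, it drops the original column):

```python
import pandas as pd

# A stand-in for the dataframe shown above
mydf = pd.DataFrame({"prod": ["Solaris 11.3", "Solaris 11.3", "Solaris 11.4",
                              "Solaris 11.4", "Solaris 11.3"],
                     "rev": [2, 1, 4, 5, 1]})

# get_dummies builds prod_<value> columns directly and drops 'prod'
df3 = pd.get_dummies(mydf, columns=["prod"], prefix="prod")
print(df3.columns.tolist())
# ['rev', 'prod_Solaris 11.3', 'prod_Solaris 11.4']
```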
This turned into a surprisingly tricky project… I have a SignaLink as my main audio interface box to my radios. It’s basically a USB sound card that connects to the data port of my radios and lets me use digital modes. It also has a couple of knobs to control levels. It’s great, I love it, but as it costs about €130 I wasn’t going to buy another one just for the convenience of not having to move cables around when I want to use it on a different radio.
When querying objects in django the simplest way is to use the filter() query.
If we have a set of systems, for example, and we want to select those that have a SPARC architecture we could do:
```
>>> System.objects.filter(architecture="sparc")
```
And if we wanted to narrow that down to only sparc machines that use NIS for name resolution we could do:
```
System.objects.filter(architecture="sparc").filter(name_resolution="NIS")
```
But that is narrowing down the selection. It’s an AND: what we’re asking for is systems that are SPARC _AND_ use NIS. And although we can use exclude() to reject some results, it’s still an AND operation. What about an OR query?
What if we wanted to select systems that had either a T5 or a T7 processor? Then we need Q queries.
Q queries take the same form of argument as filter() & exclude() etc. but can be combined using OR operators. Here’s the Q query for our model that looks at the cpu_implementation for T5 or T7:
```
Q(sysconfig__icontains="T7") | Q(sysconfig__icontains="t5")
```
To build our query we just use this Q expression in a filter:
```
from django.db.models import Q
System.objects.filter(Q(sysconfig__icontains="T7") | Q(sysconfig__icontains="t5"))
```
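Q objects also combine with & (AND) and ~ (NOT), and can be mixed with plain keyword arguments (the Q expressions just have to come first). A sketch against the same hypothetical System model from above:

```python
from django.db.models import Q

t5_or_t7 = Q(sysconfig__icontains="T7") | Q(sysconfig__icontains="t5")

# SPARC systems with either processor: a Q expression AND a keyword filter
System.objects.filter(t5_or_t7, architecture="sparc")

# Everything except NIS systems, using ~ instead of exclude()
System.objects.filter(~Q(name_resolution="NIS"))
```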
From time to time the International Space station transmits Slow Scan TV (SSTV) images as it passes overhead. These are relatively simple to receive and decode and sometimes you can get a downloadable certificate for decoding them.
An evolving notes-to-self type post…
The ISS has two digipeaters, one on 145.825 MHz and another on 437.550 MHz. The 2m one, being easier to use since you don’t really need to account for Doppler, is the most popular; though the 70cm one saw widespread use when the 2m went offline a couple of years back.
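To put rough numbers on that, here’s a back-of-the-envelope sketch of my own (assuming an ISS orbital speed of about 7.66 km/s, worst case with all of it along the line of sight):

```python
C = 299_792_458.0  # speed of light, m/s
V = 7_660.0        # approximate ISS orbital speed, m/s (assumption)

def max_doppler_hz(freq_hz):
    """Worst-case Doppler shift for a transmitter moving at V along the path."""
    return freq_hz * V / C

print(max_doppler_hz(145.825e6))  # ~3.7 kHz on 2m: sits within an FM passband
print(max_doppler_hz(437.550e6))  # ~11.2 kHz on 70cm: needs retuning during a pass
```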
While messing about with WSJT-X I noticed that it recommends setting the mode to Data/Pkt if the radio supports it. My radio, a Yaesu FT-450D, does. But it didn’t work. Thus began a couple of hours down the rabbit hole of serial connections…