Category Archives: Tech

Compiling OpenCV on macOS 10.13.6 with CUDA

Just a little hint for myself:

cmake -DCMAKE_BUILD_TYPE=Release \
-DOPENCV_CUDA_FORCE_BUILTIN_CMAKE_MODULE=ON \
-DWITH_PROTOBUF=ON \
-DOPENCV_EXTRA_SHARED_LINKER_FLAGS="-lomp" \
-DOPENCV_EXTRA_MODULES_PATH=<path to opencv contrib repo>/modules \
<path to opencv source>

Important things here:

  • OPENCV_CUDA_FORCE_BUILTIN_CMAKE_MODULE – is required due to some weird stuff in cmake (cmake's internal FindCUDA.cmake seems to be wrong); this comment on opencv github by alalek was extremely useful and provided us with the right hint.
  • WITH_PROTOBUF – is also required; otherwise the build fails with crazy libprotobuf linker errors.
  • OPENCV_EXTRA_SHARED_LINKER_FLAGS="-lomp" – if you're using clang 10 or older, you'll need to add "-lomp" to the linker flags manually, because clang ignores the "-fopenmp" flag which cmake passes to the driver in this case.

  • The rest of the flags are pretty obvious.

Simple script to create macOS installation .iso media


This assumes that you downloaded the installer from the App Store, and that it is stored as a .app in your /Applications directory.

Below is the script. The only parameter you have to customize is the "DIST" variable, which should be your distributive name ("Mojave", "Catalina", and so on).

This article is based on instructions from

As a result, you should get an .iso file on your desktop with the name of your distributive (e.g. "Catalina.iso").


DIST="Catalina"  # Customize this: "Mojave", "Catalina", and so on.

INSTALL_APP="Install macOS $DIST.app"
MEDIA_VOL="Install macOS $DIST"
VOL_NAME="$DIST"
MOUNT_POINT="/Volumes/$VOL_NAME"
DMG_FILE="$HOME/Desktop/$DIST"
DMG_FILE_EXT="$DMG_FILE.dmg"
CDR_FILE="$HOME/Desktop/$DIST.cdr"
ISO_FILE="$HOME/Desktop/$DIST.iso"

echo Create dmg... &&
hdiutil create -o "$DMG_FILE" -size 8500m -volname "$VOL_NAME" \
       -layout SPUD -fs HFS+J &&

echo Attach dmg... &&
hdiutil attach "$DMG_FILE_EXT" -noverify -mountpoint "$MOUNT_POINT" &&

echo Create install media... &&
sudo /Applications/"$INSTALL_APP"/Contents/Resources/createinstallmedia \
    --volume "$MOUNT_POINT" --nointeraction &&

echo Detach "$MEDIA_VOL"... &&
hdiutil detach /Volumes/"$MEDIA_VOL" &&

echo Convert "$DMG_FILE_EXT" -\> "$CDR_FILE" &&
hdiutil convert "$DMG_FILE_EXT" -format UDTO -o "$CDR_FILE" &&

echo Rename "$CDR_FILE" -\> "$ISO_FILE" &&
mv "$CDR_FILE" "$ISO_FILE" &&

echo Cleanup... &&
rm "$DMG_FILE_EXT"

Anti make install

What to do if you accidentally installed some experimental project into your system folders? Is there any way to undo the install script? Here is a simple solution.


Disclaimer: everything below may or may not work. It is all provided as-is, and you and only you are the responsible person in case of damage, data loss, and so on. But I hope things go smoothly!

To undo `make install` I would do (and I did) this:

Idea: check what exactly the script installs, and undo it with a simple bash script.

1. Reconfigure your build dir to install to some custom dir. I usually do this: `--prefix=$PWD/install`. For CMake, you can go to your build dir, open CMakeCache.txt, and fix the CMAKE_INSTALL_PREFIX value.
2. Install the project to the custom directory (just run `make install` again).
3. Now we proceed from the assumption that the `make install` script puts into the custom dir exactly the same contents you want to remove from somewhere else (usually `/usr/local`). So, we need a script.
3.1. The script should compare the custom dir with the dir you want to clean. I use this:


#!/bin/bash

RM_DIR=$1
PRESENT_DIR=$2

echo "Remove files from $RM_DIR, which are present in $PRESENT_DIR"

pushd "$RM_DIR"

for fn in `find . -iname '*'`; do
    # echo "Checking $PRESENT_DIR/$fn..."
    if test -f "$PRESENT_DIR/$fn"; then
        # First try this, and check whether things go well:
        echo "rm $RM_DIR/$fn"

        # Then uncomment this (but check twice that it works well for you):
        # rm "$RM_DIR/$fn"
    fi
done

popd

3.2. Now just run this script (it is a dry run for now):

bash <script name> <dir you want to clean> <custom installation dir>

E.g. you want to clean /usr/local, and your custom installation dir is /user/me/; then it would be:

bash <script name> /usr/local /user/me/

3.3. Check the log carefully. If the commands look good to you, uncomment `rm "$RM_DIR/$fn"` and run it again. But stop! Did you really check carefully? Maybe check again?

Good luck!


C++ SQL-like Select example (imperfect)

I just would like to keep it here…

Maybe there is a better implementation? I spent 30 minutes on it and have no more time today.

// select.cpp

#include <functional>
#include <iostream>
#include <vector>

template <
    typename OutputCollectionT,
    typename OutputItemT = typename OutputCollectionT::value_type
>
class Select {
public:
  template <
      typename InputCollectionT,
      typename InputItemT = typename InputCollectionT::value_type
  >
  class From {
  public:
      static OutputCollectionT Do(
          const InputCollectionT &Input,
          std::function<OutputItemT(const InputItemT &In)> Selector
      ) {
        OutputCollectionT Out;
        for (const InputItemT &In : Input)
          Out.push_back(Selector(In));
        return Out;
      }
  };
};

struct A {
    int P1;
    int P2;
    int P3;
};

struct B {
    int P1;
    int P3;
};

int main() {
  std::vector<A> aa = {
      {1, 2, 3},
      {2, 3, 4},
      {3, 4, 5}
  };

  auto bb = Select<std::vector<B>>::From<std::vector<A>>::Do(
      aa, [=] (const A &a) -> B {
        return { a.P1, a.P3 };
      });

  for (auto &b : bb)
    std::cout
        << "P1:" << b.P1 << ", "
        << "P3:" << b.P3 << "\n";

  return 0;
}

Edge detection shader for text

Hi there! I’m working on text rendering for my small Bird OSD project.

I want to add contours to the text, so it stays visible whatever background it is rendered on (bright or dark).

For example I want to enhance text rendering for cases like this:

(Ugh… My eyes suffer!)

Into this one:

Assuming we have a 1-component color on input, which consists only of an alpha channel, I want to mark as edge the alpha values around 0.5.

Below is my shader, which works; in fact, the screenshots above are its demonstration. It still has some limits, though: the edge radius can't take values ending in .5, due to a special rounding case for N*0.5 values. If you use it and find more issues, please let me know.

// The input texture
uniform sampler2D uTexture;
varying vec2 vTexCoord;  // Interpolated texture coordinate per fragment.

uniform float uOpacity;

uniform float uWidth;

uniform float uHeight;

// If the foreground value is higher than the threshold, then the edge is zero for this pixel.
const float NO_EDGE_THRESHOLD = 0.5;

// Edge radius, works fine in range from 0.6 to 2.0.
// Please don't use N*0.5 values, since they have special rounding rules,
// and the pixel at the left may not end up at the same distance as the one at the right.
const float EDGE_RADIUS = 1.;

const vec3 EDGE = vec3(0., 0., 0.);

// Detects whether we should put an edge value in the center.
// Note that if the center is a foreground value = 1, then there is
// no need for an edge
// (in practice we also admit some values below 1,
// determined by the threshold).
// We work with fonts, not a regular image, so
// the edge is a function of the average of two pixels (not the difference):
// f(l, r) = edge((l + r) / 2)
//    assuming the edge should be max when the input value is "k" (belongs to range (0, 1)).
// So, how is the 'edge' function defined?
// (see the picture if the formulae are difficult)
//    edge(v) = (1./k) * v, if v <= k && v > 0
//              otherwise it is the line which goes through p1[y=1, x=k] and p2[y=0, x=1]
//  Y ^
//    | p[y=1,x=k]
//    |  /\
//    | /  \
//    |/    \
//    ----------->
//    0  k  1    X
//    y=edge(x) formula
// In the case k = 0.5,
// f(l, r) = 1 - |l + r - 1|
// Let's use this case!
// params:
//    left - left foreground value
//    center - center foreground value
//    right - right foreground value
// returns:
//    edge value.
float getEdge(float left, float center, float right) {
    if (center > NO_EDGE_THRESHOLD)
        return 0.;

    if (center > left && center > right)
        return 0.;

    float ledge = 1. - abs(left + center - 1.);
    float redge = 1. - abs(right + center - 1.);

    return max(ledge, redge);
}

float getNeighbour(float row, float col) {
    float dx = EDGE_RADIUS / uWidth;
    float dy = EDGE_RADIUS / uHeight;

    float texX = clamp(vTexCoord.x + col * dx, 0. + dx/2., 1. - dx/2.);
    float texY = clamp(vTexCoord.y + row * dy, 0. + dy/2., 1. - dy/2.);

    return texture2D(uTexture, vec2(texX, texY)).a;
}

float calcEdge(float centerValue) {
    // Neighbour pixels:
    // neighbour[i][j] is the neighbour with X = x + (j-1) * DX; Y = y + (i-1) * DY;
    // neighbour[0][0] is the neighbour with X = x - DX; Y = y - DY;
    float neighbour_0[3];
    float neighbour_1[3];
    float neighbour_2[3];

    for (int j = 0; j != 3; ++j)
        neighbour_0[j] = getNeighbour(-1., float(j-1));

    for (int j = 0; j != 3; ++j)
        neighbour_1[j] = getNeighbour(0., float(j-1));

    for (int j = 0; j != 3; ++j)
        neighbour_2[j] = getNeighbour(1., float(j-1));

    float horEdge = getEdge(neighbour_1[0], centerValue, neighbour_1[2]);
    float vertEdge = getEdge(neighbour_0[1], centerValue, neighbour_2[1]);
    float ltrbEdge = getEdge(neighbour_0[0], centerValue, neighbour_2[2]);
    float rtlbEdge = getEdge(neighbour_0[2], centerValue, neighbour_2[0]);

    return max( max(horEdge, vertEdge), max(ltrbEdge, rtlbEdge) );
}

vec4 calcFinalValue(vec3 foreground, float foregroundValue, float edgeValue) {
#if 1
    float sumFgEdge = foregroundValue + edgeValue;
    if (sumFgEdge <= 0.)
        return vec4(0.);  // Fully transparent; also avoids division by zero below.

    vec3 color = vec3(
        foreground * foregroundValue / sumFgEdge +
        EDGE * edgeValue / sumFgEdge);

    return vec4(color.r, color.g, color.b, min(sumFgEdge, 1. * uOpacity));
#else
    // Debug variants:
    //return vec4(EDGE, edgeValue);
    return vec4(foreground.r, foreground.g, foreground.b, foregroundValue);
#endif
}

// The entry point for our fragment shader.
void main() {
    vec4 texColor = texture2D(uTexture, vTexCoord);

    float foregroundValue = texColor.a;
    float edgeValue = calcEdge(foregroundValue);

    gl_FragColor = calcFinalValue(
        vec3(texColor.r, texColor.g, texColor.b),
        foregroundValue, edgeValue);
}

The Bird Project

Bird OSD


You have a Raspberry Camera and you need FPV, but you can't fight the 100-200ms latency? Then there is a solution.

Bird OSD turns your Raspberry PI into FPV stream source with OSD overlay.


Since the raspberry has a Video Composite Output, you can then cast the raspberrian screen just like a regular FPV signal, over an FPV transmitter module!

The Raspberry Pi works on a Broadcom SoC with a VideoCore processor, which means we can apply an OSD overlay to the camera stream with really low realtime latencies.

An X server is not required.

Bird OSD is a systemd service; it uses the raspivid app to grab the camera image, and its own bird-osd GLES2 application to apply an overlay with sensor data on it.

So finally you should see something like this:

(GPS was broken, sorry; I still can't demonstrate it in a real flight)

Another pic from FPV goggles:


  1. An RPI device with a sensors board (navio2 is ok)
  2. A Raspberry Camera connected to it
  3. Something sending MAVLink data (ardupilot, arducopter, whatever)

How to install

Download .deb package onto your raspberry device:

$ wget

And then install it:

$ sudo dpkg -i bird-osd_1.1.2_armhf.deb

Then you should target a MAVLink channel to it.

E.g. for arducopter:

$ sudo nano /etc/default/arducopter 

Ensure you have a string like this:

TELEM1="-A udp:"

Or like this:

TELEM2="-C udp:"

In case you modified the /etc/default/arducopter config, you should restart the service:

$ sudo systemctl restart arducopter

Finally, you should start the bird-osd service with this command:

$ sudo systemctl start bird-osd

Then, on a monitor connected to your raspberry, you should see whatever your camera sees + an overlay with sensors data!

It is still very first version:

  1. I only tested it on the RPI 3. I added dependencies on raspivid and on bash:
    libraspberrypi-bin (>= 1.20180417-1), bash (>= 4.4-5)

    Perhaps the dependency versions are higher than it really needs; I just had no opportunity to test it in other envs.

  2. Don't blame me, guys, for not opening the sources. They are such a mess, I need to sort them out first.
  3. It still consumes too much CPU time. After the holidays I'll work a bit on optimizations. It uses a text atlas, but still builds the text layout dynamically. It should render every static text to a texture; per profiling survey results, this should improve performance by 30-40% (since most of the text labels are static).
  4. Any proposals are welcome.

How to enable or disable the service

If you want to enable bird-osd on boot, you should run:

$ sudo systemctl enable bird-osd

This command disables the service:

$ sudo systemctl disable bird-osd

How to uninstall

And this command removes bird-osd from your raspberry device:

$ sudo dpkg -P bird-osd

Relevant topics

Edge detection shader for text – a simple edge detection shader for text-like foreground drawings


Artificial Neural Networks and Puffer Jacket

Let's imagine that humanity died out, and a new civilization discovered that human beings used puffer jackets to save body heat.

But they didn't know how it works. So they "disassembled" one of the jackets they found and discovered that it consists of bird fluff.

And this is how they reproduced the heat-saving thing. They made a building with walls out of fluff cemented with clay. And got little progress. So they tried to use glair instead of clay, but it was still too bad.

Only 128 years later, when they had advanced in heat theory, did they understand what was wrong, and they made a building with hollow walls filled with fluff.

A bit later, knowing the theory and having learned chemistry and gas dynamics, they stopped killing birds and replaced the fluff with synthetic heat insulation fiber.

Well, if we advance in how our mind works, perhaps we'll find a better use for artificial neural networks, or maybe replace them with a better concept?
