The Power of Choosing Optimism

"The happiness of your life depends upon the quality of your thoughts." - Marcus Aurelius

My first memories of childhood are grey, not in the cool, noir style of 1920s movies, but in the bleak, post-communist greyness of 1990s Romania. I was surrounded by people who had hoped for freedom for so long that once they finally attained it, they had no idea what to do with it. It was a time when everyone fended for themselves, and everything seemed dire. Outlooks were bleak, opportunities were scarce, and optimism was rarely displayed or witnessed.

In the face of life's many challenges and uncertainties, it can be all too easy to slip into negativity, cynicism, and despair. The struggles we face, both big and small, can wear us down over time if we let them. However, I've found that one of the most powerful tools we have for living a happy and fulfilling life is also one of the simplest: choosing optimism.

Optimism is not about ignoring life's difficulties or pretending everything is perfect. Rather, it's about maintaining a positive outlook and hopeful attitude even in hard times. It's believing that we have the strength to overcome adversity, that better days lie ahead, and that our efforts can make a difference.

When we choose to be optimistic, several things happen:

  1. We become more resilient. Optimism gives us the grit to persevere through setbacks. Instead of being defeated by obstacles, we trust in our ability to find a way forward.
  2. We notice more opportunities. An optimistic mindset helps us see possibilities where a pessimist sees only dead-ends. Optimists tend to be better at finding creative solutions.
  3. We attract positive people and situations. Optimism is contagious. When we emit a hopeful, positive energy, we naturally draw more of the same into our lives.
  4. We feel happier and more fulfilled. Perhaps most importantly, choosing to be optimistic just plain feels good. It elevates our mood, reduces stress and anxiety, and reminds us to be grateful for everything we have.

Optimism is like a muscle - the more we practice it, the stronger it gets. By making the intentional choice each day to be hopeful, positive and solution-focused, even in small ways, we build our capacity to handle whatever life sends our way.

In a world that can often feel dark and negative, choosing optimism is a revolutionary act. It allows the light of hope and possibility to shine through. And that is a truly powerful thing, especially for someone like me who grew up in a time and place where optimism was in short supply. By consciously embracing optimism, we can break free from the shadows of the past and create a brighter future for ourselves and those around us.

ZSH Alias Last Command and Alias + Persist commands

Introduction

If you find yourself constantly running the same long commands on your terminal, setting up quick aliases can be a game-changer. This blog post will walk you through creating two handy ZSH functions that allow you to define and save aliases on-the-fly. With these, you can alias any command, including the last command you ran, without even leaving the terminal!


1. Create Aliases On-the-Fly

First up, let's talk about how you can instantly create aliases for any given command.

Code Snippet:

add_alias() {
  local alias_name="$1"
  local alias_command="$2"
  local aliases_file="$HOME/.zsh_aliases"

  if [ -z "$alias_name" ] || [ -z "$alias_command" ]; then
    echo "Usage: add_alias <name> <command>"
    return 1
  fi

  # Create the alias in the current session
  alias "$alias_name=$alias_command"

  # Append the alias to the .zsh_aliases file for persistence
  echo "alias $alias_name=\"$alias_command\"" >> "$aliases_file"

  echo "Alias $alias_name added."
}

 

Usage:

Place this function in your .zshrc file or another file that's sourced by it. Then add source $HOME/.zsh_aliases in your .zshrc to load these aliases in future sessions.
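Concretely, the relevant lines in your .zshrc might look like this (the path to the functions file is just an example, keep yours wherever you like):

```shell
# ~/.zshrc
source "$HOME/.zsh_functions"                              # wherever add_alias lives
[ -f "$HOME/.zsh_aliases" ] && source "$HOME/.zsh_aliases" # load saved aliases
```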

add_alias ll "ls -la"

Now, ll will be an alias for ls -la, and it will also be saved in your .zsh_aliases file for future use.


2. Alias the Last-Run Command

Ever run a command and immediately realised you want to alias it? This function has you covered.

Code Snippet:

# Feel free to name this function whatever you like. Same for the one above.
alias_last_command() {
  local alias_name="$1"
  local aliases_file="$HOME/.zsh_aliases"
  local last_command

  if [ -z "$alias_name" ]; then
    echo "Alias name is required."
    return 1
  fi

  # Grab the previous command from history and trim leading whitespace
  last_command=$(fc -ln -1)
  setopt localoptions extendedglob
  last_command="${last_command##[[:space:]]#}"

  # Create the alias in the current session
  alias "$alias_name=$last_command"

  # Append the alias to the .zsh_aliases file for persistence
  echo "alias $alias_name=\"$last_command\"" >> "$aliases_file"

  echo "Alias $alias_name for '$last_command' added."
}

Usage:

Just like the previous function, add this to your .zshrc and make sure you're sourcing the .zsh_aliases file.

To create an alias for the last command you ran:

alias_last_command ll

This will make ll an alias for whatever your last command was, saving it in .zsh_aliases for future sessions.


Conclusion

With these two ZSH functions, you can quickly and conveniently set up aliases right from your terminal, making your workflow much more efficient. Give them a try and watch your productivity soar!

Happy coding! 🚀

Introducing goqueuelite: Golang + SQLite queue

It finally happened! I am about to introduce my first proper open source project. It is called goqueuelite, and it is a Golang package that tackles the job-queue problem using only SQLite.

The package can be found at github.com/risico/goqueuelite, check it out. It is not production-ready yet, although I’ve already been using it in maile.io (from where I’ve extracted it) and it can handle quite a bit of load (I’ll post benchmarks soon).

Getting started

Import the library into your project as usual:

package main

import (
	"github.com/risico/goqueuelite"
)

func main() {
	s, err := goqueuelite.New(goqueuelite.Params{
		DatabasePath: "queue.db",
	})
	// handle err

	// put "somedata" into the "default" namespace, execute as soon as possible
	err = s.Put("default", "somedata", 0)
	// handle err

	v, err := s.Pop("default")
	// handle err

	// do something with v

	s.Done(v.MessageID)
	// or
	s.Fail(v.MessageID)
	// or
	s.Retry(v.MessageID)
}

The library is safe to use concurrently, so the main use case here is to run workers in goroutines that pick up jobs to work on. Jobs can be marked as done, failed, or sent back for retry.

The API will definitely change until v1 is stable, so please be prepared for that.

Have fun!

Understanding the Unusual Behavior of Golang's Custom UnmarshalJSON Method with Inner and Outer Struct Fields

Introduction

In this blog post, we will discuss an interesting case in Golang where using a custom `UnmarshalJSON` method on a struct with both inner and outer fields results in only the inner fields being unmarshaled. We will look into why this occurs and suggest two alternative solutions to overcome this issue. Let's start by understanding the problem.

The Problem

Consider the following Go code with a struct named **`Person`** that has inner and outer fields:

 

type Name struct {
    First string `json:"first"`
    Last  string `json:"last"`
}

type Person struct {
    Name
    Age int `json:"age"`
}

Now, we want to implement a custom `UnmarshalJSON` method for the **`Person`** struct:

 

func (p *Person) UnmarshalJSON(data []byte) error {
    type alias Person
    aux := struct {
        *alias
        Age int `json:"age"`
    }{
        alias: (*alias)(p),
    }
    if err := json.Unmarshal(data, &aux); err != nil {
        return err
    }
    p.Age = aux.Age
    return nil
}

The expected behavior is that the custom `UnmarshalJSON` method should unmarshal both inner (Name) and outer (Age) fields. However, it turns out that only the inner fields are unmarshaled, and the outer field (Age) is ignored.

Why This Happens

The issue arises from the embedded struct Name inside the Person struct. When the custom UnmarshalJSON method runs, the JSON data is unmarshaled into the embedded struct first. The outer field (Age) is then shadowed by the same-named field in the auxiliary struct, which causes it to be ignored during the unmarshaling process.

Alternatives

To overcome this issue, we have two alternative solutions:

1. Refactor the struct to have no inner fields:

type Person struct {
    FirstName string `json:"first"`
    LastName  string `json:"last"`
    Age       int    `json:"age"`
}

By doing this, we avoid the issue of field shadowing and ensure all fields are unmarshaled correctly.

2. Create a separate UnmarshalJSON method for the inner struct:

func (n *Name) UnmarshalJSON(data []byte) error {
    type alias Name
    aux := &struct {
        *alias
    }{
        alias: (*alias)(n),
    }
    if err := json.Unmarshal(data, aux); err != nil {
        return err
    }
    return nil
}

By creating a separate UnmarshalJSON method for the inner struct (**`Name`**), we ensure that the JSON data is correctly unmarshaled for both the inner and outer fields.

Conclusion

In this blog post, we explored the peculiar behavior of Golang's custom UnmarshalJSON method when used with a struct containing both inner and outer fields. We discovered that only the inner fields are unmarshaled, and the outer fields are ignored due to field shadowing. To resolve this issue, we presented two alternative solutions - either refactor the struct to have no inner fields or create a separate `UnmarshalJSON` method for the inner struct.

git check-ignore

In the world of version control, Git has become an indispensable tool for developers. One of its key features is the ability to selectively ignore certain files or directories with the help of the .gitignore file. This can be a real lifesaver when you need to exclude files that don't belong in your repository, like build artifacts, logs, or user-specific settings. However, sometimes it can be challenging to figure out why a particular file is being ignored. That's where the git check-ignore command comes in handy! In this blog post, we'll explore this powerful yet underutilized Git command and how it can help you understand your .gitignore configuration.

A Quick Overview

git check-ignore is a command that allows you to determine if a file or directory is being ignored by Git due to the rules specified in your .gitignore files. By using this command, you can quickly identify which rule is responsible for ignoring a specific file or directory and make necessary adjustments if needed. The basic syntax of the command is:

git check-ignore [options] <pathspec>...

Let's break down some common use cases and how git check-ignore can help in each scenario.

Identifying the Cause of Ignored Files

Imagine you've just cloned a repository and found out that a file is missing. You're not sure whether it's been deliberately ignored or accidentally excluded. To find the reason, simply run:

git check-ignore -v <file_path>

The -v (verbose) option shows not only the ignored file but also the exact .gitignore rule and the file responsible for that rule. If the file is not ignored, the command will produce no output.
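For instance, spinning up a throwaway repo shows the shape of the verbose output (the file names here are only illustrative):

```shell
cd "$(mktemp -d)"
git init -q .
printf '*.log\n' > .gitignore
touch debug.log
git check-ignore -v debug.log
# prints: .gitignore:1:*.log	debug.log   (source:line-number:pattern<TAB>path)
```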

Checking Multiple Files

You can also check multiple files or directories at once by providing multiple <pathspec> arguments:

git check-ignore -v <file_path1> <file_path2> <file_path3>

Debugging .gitignore Rules

Another use case for git check-ignore is to test and debug your .gitignore rules. This can be especially helpful when you're dealing with complex exclusion patterns. By running git check-ignore on different file paths, you can verify if your rules are working as intended.

Tips and Tricks

  • Use the --no-index option to check paths without consulting the index. This is handy for figuring out why an already-tracked file was never ignored:

    git check-ignore --no-index -v <file_path>
  • If you're using multiple .gitignore files (e.g., one per directory), remember that git check-ignore evaluates all applicable rules in the order they appear, and the last matching rule wins. This is crucial when working with negation patterns (e.g., !important.log).
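The "last match wins" behaviour is easy to see with a negation pattern in a scratch repo (file names again illustrative):

```shell
cd "$(mktemp -d)"
git init -q .
printf '*.log\n!important.log\n' > .gitignore
touch debug.log important.log
git check-ignore debug.log important.log
# prints only debug.log: the later !important.log rule re-includes important.log
```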

Conclusion

git check-ignore is a powerful command that helps developers understand and manage their .gitignore configurations. By leveraging this command, you can quickly identify why specific files or directories are being ignored, debug your .gitignore rules, and ensure that your repository stays clean and efficient. So, the next time you're struggling with ignored files, don't forget to use git check-ignore to shed some light on the mystery!

Lessons in cybersecurity, Part I

Here's a little story from the trenches, from far far away, back when I was a kid learning my way through webservers, PHP, and vulnerable (pirated) bulletin board software.

👋🏻 Intro

A long time ago, around 200*-something, I was really interested in game-hacking topics, and somehow I ended up in charge of a big forum in the niche. That was my first intro to PHP and anything web-related; while I was quite good with C++/assembly and knew a bit about breaching software security, I had stumbled into a fresh new world.

The forum software we were using at the time was an old, pirated version of vBulletin (it predated my rise to power 😅), sitting on a Dreamhost shared server running PHP 4-something. I wrote code by downloading the files I needed via FTP, making my changes in Notepad++, and uploading them back, a dozen times over until it worked. All of this happened while users were online. The good times of the mono dev-testing-production environment <3

☕ The morning wake-up call

The actual story begins one morning when I woke up like any other day, sat down at my desk, and checked my beloved community as I did every day. But this time, strangely, it was different: a black webpage appeared, with red text and a funky “hacker” image. I checked the URL again. It was correct ... panic!

I quickly jumped into my trusty FTP client and opened the website’s root folder. All the files were still there; it seemed only the index.php file had been replaced. And because I knew no better, I simply replaced index.php with one I had in backups.

Crisis averted, everything fixed. I did my best to change the passwords around and even upgraded the forum software to the latest version (that I could find). Time to move on and get on with my day, thinking everything was well and life was nice again. Narrator voice: things weren’t well, and life would not be nice for long.

🕛 Next morning, deja-vu

Same as the day before: same routine, same hacked website, same defaced homepage. 😭 I was having a deja-vu. I thought I had fixed it, how could it be? Well, it be. I went through the same steps as the day before, uploaded my index.php back, and went on with my day.

This weird dance continued for a couple of days.

🛡️ The guardian cron job

Finally fed up with going through this whole dance every morning, I came up with my brilliant solution (or so I thought at the time): automate it. I wrote a script that hashed index.php and compared it against a known-good hash of the original file; if the hashes differed, it would copy over and replace the whole forum software with the “good one”. I threw that into a cron job that ran the checker script every couple of minutes.
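Reconstructed from memory, the checker looked roughly like this; the paths, names, and hash are all illustrative:

```shell
#!/bin/sh
# Restore the site from a known-good backup whenever index.php's hash changes.
check_index() {
    site="$1"; backup="$2"; good_hash="$3"
    current=$(md5sum "$site/index.php" | awk '{print $1}')
    if [ "$current" != "$good_hash" ]; then
        # Defaced again: clobber everything with the known-good copy
        cp -R "$backup/." "$site/"
        echo "index.php tampered with, restored from backup"
    fi
}
```

Wired into the crontab with something along the lines of `*/5 * * * * $HOME/bin/check_index.sh` (whatever schedule the shared host allowed).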

It worked! The homepage never got defaced again, well, I never saw it get defaced again 😅.

In retrospect, there’s more I could have done, a lot more, but at the same time I was limited by the tools at hand and, more importantly, my knowledge. Given the website was on shared hosting, not much was under my control, but I could have at least dug into the root cause a bit more, investigated, and tried to understand what happened before applying my brute-force fix.

Overall this was a good lesson and a good start in cybersecurity; it was the spark that lit my curiosity for the field.

https://c.tenor.com/tm3KA5yrnmMAAAAC/hacker-man-hacker.gif

GBrowse selected lines and copy to clipboard

When talking with colleagues over chat about certain parts of the code it’s very helpful to show and give context as quickly as possible.

Here’s a way to quickly copy a link to GitHub (or GitLab) of the current visual selection in Vim. You need vim-fugitive installed and an upstream provider.

vim-fugitive provides the :GBrowse command while vim-rhubarb or shumphrey/fugitive-gitlab.vim know how to handle the upstream provider.

After installing those plugins, the :GBrowse command should open the current file in GitHub/GitLab. To make it work for a visual selection of lines, the following can be used.

:'<,'>GBrowse!

To make things simpler and faster, I have it mapped to gb as follows:

vnoremap gb :'<,'>GBrowse!<CR>

That’s it!