Muck

Steam achievements and progress for Muck - 2.04% complete with 1/49 achievements unlocked.


sein

Steam achievements and progress for sein - 8.77% complete with 5/57 achievements unlocked.

[1] xrandr is a great cli for managing your displays on a linux distro using x11, which is most of them. The issue is that I can never remember all the flags to the command, and if you are using it with something like a laptop on a dock, the names of all the displays tend to change every time you redock. This makes it really hard to write scripts that work right every time.

Homepage # [2]

Check out deresmos/xrandr-manager [3] for more details on it.

installation # [4]

xrandr-manager is a python cli application that is simply a nice interface to xrandr, so you must have xrandr already installed. It is generally just there on any x11 window manager; I’ve never had to install it. As with any python cli that is intended to be used as a global/system-level cli application, I always install it with pipx. This automates the process of creating a virtual environment [5] for xrandr-manager for me, and does not clutter up my system packages with dependencies that may eventually clash with another package I want to use.

# prereqs (xrandr, pipx)
pipx install xrandr-manager

set main monitor # [6]

First, if your main display is not set to the correct monitor, set your main dis...
jq has some syntax that will sneak up on you with complexity. It looks so good, and so understandable, but every time I go to use it myself, I don’t get it. ijq is an interactive wrapper around jq that gives you a nice repl where you can iterate on queries quickly.

paru -Syu ijq

Here are some other articles I decided to link at the time of writing this article.

JUT | Read Notebooks in the Terminal [1]
Comprehensive guide to creating kedro nodes [2]
Kedro - My Data Is Not A Table [3]

References:
[1]: /jut/
[2]: /kedro-node/
[3]: /kedro-pickle/
cli
I love getting faster in my workflow; something I have recently added is creating GitHub repos with the cli. I often create little example projects, but they just end up on my machine and not anywhere that someone else can see, mostly because it takes more effort to go create a repo. TIL you can create a repo right from the command line and push to it immediately.

gh repo create waylonwalker-cli [1]

want to see what this repo I created is about? # [2]

Check out what I created here.

pipx run waylonwalker

References:
[1]: https://dropper.waylonwalker.com/api/file/3a889b2a-d83f-4f42-a849-1c34b8e6365c.webp
[2]: #want-to-see-what-this-repo-i-created-is-about
git
totally guessed at this post’s date

I’m still trying to understand this one, but this is how you force a python object to stop its subprocess at exit with atexit.

import atexit
import subprocess
import time
from pathlib import Path
from typing import Union

from rich.panel import Panel


class Server:
    def __init__(
        self,
        auto_restart: bool = True,
        directory: Union[str, "Path"] = None,
        port: int = 8000,
    ):
        self.auto_restart = auto_restart
        if directory is None:
            from markata import Markata

            m = Markata()
            directory = m.config["output_dir"]
        self.directory = directory
        self.port = find_port(port=port)  # find_port is defined elsewhere in this module
        self.start_server()
        atexit.register(self.kill)

    def start_server(self):
        self.cmd = [
            "python",
            "-m",
            "http.server",
            str(self.port),
            "--directory",
            self.directory,
        ]
        self.proc = subprocess.Popen(
            self.cmd,
            stderr=subprocess.PIPE,
            stdout=subprocess.PIPE,
        )
        self.start_time = time.time()

    def kill(self):
        self.auto_restart = False
        self.proc.kill()

    def __rich__(self) -> Panel:
        if not self.proc.poll():
            return Panel(
                f"[green]serving on port: [gold1]{self.port} [green]using pid: [gold1]{self.proc.pid} [green]uptime: [gold1]{self.uptime} [green]link: [gold1] http://localhost:{self.port}[/]",
                border_style="blue",
                title="server",
            )
        else:
            if self.auto_restart:
                self.start_server()
            return Panel(f"[red]...
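The piece doing the work here is atexit.register. A minimal sketch (shelled out to python3; the messages are made up for the demo) shows that handlers fire at interpreter exit, in reverse registration order, which is how the server’s kill gets called after everything else finishes:

```shell
# Tiny atexit demo: handlers run at interpreter exit, LIFO order.
python3 - <<'EOF'
import atexit

atexit.register(print, "registered first, runs last")
atexit.register(print, "registered second, runs first")
print("main done")
EOF
```

Both registered lines print after `main done`, without either ever being called explicitly.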

Portal

Steam achievements and progress for Portal - 26.67% complete with 4/15 achievements unlocked.

Whenever you are installing python packages, you should always use a virtual environment. pip makes this easy to enforce with a bit of configuration.

require-virtualenv # [1]

Pip is the package tool for python. It installs third-party packages and is configurable. One of the configuration settings that I highly recommend everyone add is require-virtualenv. This will stop pip from installing any packages if you have not activated a virtualenv.

why # [2]

python packages often require many different dependencies; sometimes packages are up to date and sometimes they require different versions of dependencies. If you install everything in one environment it’s easy to end up with version conflict issues that are really hard to resolve, especially since your system environment cannot easily be recreated.

PIPX my one exception # [3]

My one exception that I put in my system level packages is pipx. pipx is very handy as it manages virtual environments for you, and is intended for command line utilities that would otherwise end up in your system env or require you to manually manage virtual environments.

pip config # [4]

Your pip config might be found in either ~/.pip/pi...
I’ve been trying to adopt pyenv for a few months, but have been completely blocked by this issue on one of the main machines I use. Whenever I start up ipython I get the following error.

ImportError: No module named '_sqlite3'

I talked about why and how to use pyenv, along with my first impressions, in this post [1]

pyenv/issues/678 # [2]

According to #678 [3] I need to install libsqlite3-dev on ubuntu to resolve this issue.

install libsqlite3-dev # [4]

libsqlite3-dev can be installed using apt

sudo apt install libsqlite3-dev

But wait…. # [5]

When I make a fresh env and install ipython I still get the same error, and I am still not able to use ipython with pyenv.

ImportError: No module named '_sqlite3'

re-install python # [6]

After having this issue for a while and coming back to #678 [3] several times, I realized that libsqlite3-dev needs to be installed before running the install, since pyenv compiles python at install time.

pyenv install 3.8.13

I think I had tried this several times, but was missing the -y option each time. You gotta read errors like this; I am really good at glossing over them. [7]

Let’s never have this issue again. # [8]

When you spend months living with little errors like this and finally fix it, it...
Sometimes you have a pretty old branch you are trying to merge into, and you are absolutely sure what you have is what you want; you don’t want to deal with any sort of merge conflicts, you would rather just tell git [1] to use your version and move on.

update main # [2]

The first step is to make sure your local copy of the branch you are merging into is up to date.

git checkout main
git pull

update your feature branch # [3]

It’s also worth updating your feature branch before doing the merge. Maybe you have teammates that have updated the repo, or you popped in a quick change from the web ui. It’s simple and worth checking.

git checkout my-feature
git pull

start the merge # [4]

Merge the changes from main into the my-feature branch.

git merge main

Now is where the merge conflict may have started. If you are completely sure that your copy is correct you can use --ours; if you are completely sure that main is correct, you can use --theirs.

git checkout --ours .
git add .
git merge --continue

This will pop open your configured git.core.editor or $EDITOR. If you have not configured your editor, it will default to vim. Close vim with <escape>:x, accepting the merge message. Now push y...
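The whole flow can be sketched end to end in a throwaway repo; the branch name mirrors the post, while the mktemp scaffolding and demo identity are just for the sketch:

```shell
# Demo: resolve a merge conflict by keeping "our" (my-feature's) version.
set -e
dir=$(mktemp -d); cd "$dir"
git init -q -b main
git config user.email you@example.com
git config user.name you
echo base > file.txt
git add file.txt && git commit -qm base
git checkout -qb my-feature
echo ours > file.txt && git commit -qam feature
git checkout -q main
echo theirs > file.txt && git commit -qam update
git checkout -q my-feature
git merge -m merged main || true       # conflict expected here
git checkout --ours .                  # take my-feature's side everywhere
git add .
GIT_EDITOR=true git merge --continue   # accept the prepared merge message
cat file.txt                           # prints: ours
```

GIT_EDITOR=true stands in for closing vim by hand; interactively you would just accept the merge message as the post describes.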
git
A few of my friends and I all just borked our neovim configs during a plug update, and because none of us were using :PlugSnapshot it was painful to recover from.

https://twitter.com/pypeaday/status/1524882893914398732

Lucky for me I did it on a home machine that I only occasionally edit from, so I could still take the snapshot from a working machine before taking the plunge into fixing everything.

Why snapshot # [1]

Snapshotting ensures that you install the same git [2] sha of every single plugin. This way when you have multiple machines running your same vim config, they are all on the same sha of each plugin, and you don’t end up with weird things happening on one machine. And then you get to decide when you are ready to update, rather than when it breaks.

- same config everywhere
- you control the update
- in case of a borked update you have a known working place to revert to

Let’s snapshot # [3]

Running :PlugSnapshot will generate the following content in a buffer that you can save. I chose to save mine in ~/.config/nvim/snapshot.vim.

" Generated by vim-plug
" Fri 13 May 2022 08:01:39 PM CDT
" :source this file in vim to restore the snapshot
" or execute: vim -S snapsh...
vim
I really like the super clean look of no status menus, no url bar, no bookmarks bar, nothing. Don’t get me wrong, these things are useful, but honestly they take up screen real estate and I RARELY look at them. What I really want is a toggle hotkey. I found this one in one of DT’s youtube videos. I can now tap xx and both the status bar at the bottom and the address bar at the top disappear.

# ~/.config/qutebrowser/config.py
config.bind("xb", "config-cycle statusbar.show always never")
config.bind("xt", "config-cycle tabs.show always never")
config.bind(
    "xx",
    "config-cycle statusbar.show always never;; config-cycle tabs.show always never",
)
When you first start qutebrowser, it will create some config files in your home directory for you, but they will be empty.

Config # [1]

As far as I know qutebrowser will create this default config out of the box for you; if it doesn’t, then somehow it just appeared for me 😁.

❯ tree ~/.config/qutebrowser
/home/waylon/.config/qutebrowser
├── autoconfig.yml
├── bookmarks
│   └── urls
├── config.py
├── greasemonkey
└── quickmarks

2 directories, 5 files

Why convert # [2]

You might want to convert if you are more comfortable with the python config, or if, like me, you just want config in one place and you are stealing configuration options from others who have theirs in config.py.

Convert to py # [3]

References:
[1]: #config
[2]: #why-convert
[3]: #convert-to-py
I am often editing my own scripts as I develop them. I want to make a better workflow for working with scripts like this.

Currently # [1]

Currently I am combining nvim with a which subshell to edit these files like this. For now, let’s use my todo command as an example.

nvim `which todo`

First pass # [2]

On first pass I made a bash function to do exactly what I have been doing.

ewhich () { $EDITOR `which "$1"`; }

The $1 will pass the first input to the which subshell. Now we can edit our todo script like this.

ewhich todo

Note, I use bash functions instead of aliases for things that require input.

Final State # [3]

This works fine for commands that are files, but not aliases or shell functions. Next I jumped to looking at the output of command -V $1.

- if the command is not found, search for a file
- if it’s a builtin, exit
- if it’s an alias, open my ~/.alias file to that line
- if it’s a function, open my ~/.alias file to that line

ewhich () {
  case `command -V $1` in
    "$1 not found")
      FILE=`fzf --prompt "$1 not found searching ..." --query $1`
      [ -z "$FILE" ] && echo "closing" || $EDITOR $FILE;;
    *"is a shell builtin"*)
      echo "$1 is a builtin";;
    *"is an alias"*)
      $EDITOR...
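A stripped-down, testable version of the same command -V dispatch; the function name kind_of is made up for the sketch, and only the first line of output is matched so a function’s body can’t confuse the patterns:

```shell
# Classify a name by pattern-matching the first line of `command -V` output.
kind_of () {
  case "$(command -V "$1" 2>&1 | head -n 1)" in
    *"not found"*)     echo missing ;;
    *"shell builtin"*) echo builtin ;;
    *alias*)           echo alias ;;
    *function*)        echo function ;;
    *)                 echo file ;;
  esac
}

kind_of cd               # prints: builtin
kind_of kind_of          # prints: function
kind_of no-such-cmd-xyz  # prints: missing
```

The full ewhich builds on this skeleton by opening $EDITOR at the right file for each branch.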
I am getting ready to do some timeseries analysis on a git [1] repo with python. My first step is to figure out a way to list all of the git commits so that I can analyze each one however I want. The GitPython library made this almost trivial once I realized how.

from git import Repo

repo = Repo('.')
commits = repo.iter_commits()

This returns a generator; if you are iterating over them, this is likely what you want.

commits
# <generator object Commit._iter_from_process_or_stream at 0x7f3307584510>

The generator will return git.Commit objects with lots of information about each commit, such as hexsha, author, committed_datetime, gpgsig, and message.

next(commits)
# <git.Commit "d125317892d0fab10a36638a2d23356ba25c5621">

References:
[1]: /glossary/git/
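For comparison, the plain-git version of the same listing is one git log away (demoed here in a throwaway repo; the file contents and commit messages are made up). GitPython’s win is that it hands you structured Commit objects instead of text to parse:

```shell
# Same idea with git itself: one line per commit, sha + author date + subject.
dir=$(mktemp -d); cd "$dir"
git init -q -b main
git config user.email you@example.com
git config user.name you
echo one > a.txt; git add .; git commit -qm first
echo two >> a.txt; git commit -qam second
git log --format='%H %aI %s'
```

Each line of that output maps to the hexsha, committed_datetime, and message attributes on a git.Commit.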
I was editing some blog posts over ssh when I ran into this error: gpg was failing to sign my commits. I realized that this was because I could not answer the desktop keyring prompt over ssh, but had no idea how to fix it.

Error # [1]

This is the error message I was seeing.

gpg failed to sign the data

The fix # [2]

The fix ended up being pretty simple, but quite a ways down this stack overflow post [3]. This environment variable tells gpg that we are not logged into a desktop, so it does not try to use the desktop keyring and instead asks to unlock the gpg key right in the terminal.

export GPG_TTY=$(tty)

The log in menu # [4]

This is what it looks like when it asks for the passphrase. [5]

EDIT - another way # [6]

So this did not fix the issue on Arch BTW, and I have seen it not work for wsl users either. This did work for me, and was reported to work by a wsl user on a github issue.

echo '' | gpg --clearsign

This will unlock the gpg key and then let you commit.

References:
[1]: #error
[2]: #the-fix
[3]: https://stackoverflow.com/questions/41052538/git-error-gpg-failed-to-sign-data/41054093
[4]: #the-log-in-menu
[5]: https://images.waylonwalker.com/gpg-passphrase-unlock.png
[6]:...
git