Absolutely love this selfhosted arc of pewdiepie that is going on right now. It’s crazy to witness how fast he is picking up linux / self hosting, and it sounds like soon he will be programming. In this one he built a $20k AI beast that crushes gippity with power, speed, proximity, and security. No one to take your data, no latency to the data center, no one else bogging down your prompts, just raw speed. It looks absolutely wild. He implemented RAG and gave it a bunch of data about himself, and it’s able to spit out his wife’s name and phone number in under a second. It writes code at a blazing pace. This may be the future we get over the next few years: as things shift towards AI, there will be more affordable options and a larger second-hand market for building out these highly capable machines.
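For the curious, the retrieval half of RAG is conceptually simple: find the snippet of your personal data that best matches the question, then hand it to the model. A stdlib-only sketch with made-up facts (real setups use embedding models and a vector DB, not bag-of-words cosine similarity):

```python
# minimal sketch of the "R" in RAG: retrieve the best-matching snippet
# before handing it to a model. Illustrative only; facts below are made up.
import math
import re
from collections import Counter

docs = [
    "favorite editor is vim",
    "the dog's name is Maya",
    "the homelab runs a 4x GPU rig",
]

def vectorize(text):
    # bag-of-words term counts stand in for a real embedding
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))

def cosine(a, b):
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(query):
    q = vectorize(query)
    return max(docs, key=lambda d: cosine(q, vectorize(d)))

print(retrieve("what is the dog's name?"))  # → the dog's name is Maya
```

Sub-second lookups at that scale are trivial; the expensive part of his build is running the generation model locally.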
I’m so glad that python supports method chaining out of the box, very similar to the pipe operator that Jesse mentions here. It makes everything much more readable to follow the flow rather than needing to parse nested function calls like out(inside()).
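A toy contrast with plain str methods (my own example, not from the post): the chained form reads left to right in execution order, while the nested equivalent has to be parsed inside-out.

```python
text = "  Hello, World  "

# chained: reads left to right, in the order the steps actually run
chained = text.strip().lower().replace(",", "").split()

# nested: the same pipeline, but you unwrap it from the inside out
nested = str.split(str.replace(str.lower(str.strip(text)), ",", ""))

print(chained)            # → ['hello', 'world']
print(chained == nested)  # → True
```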
I greatly appreciated the wide variety of experienced maintainers of large OSS projects, from web dev to desktop applications. The most common sentiment here was: don’t contribute to open source just to contribute to open source. Bring something meaningful to the project. Find a project you like and look at the discussions/issues for work, or start some discussions. If there are no meaningful features that you can add to projects that you use and love, make your own thing. Adam from Tailwind really hit on this one several times. He has made Tailwind extensible so that you don’t have to contribute to Tailwind to get new capabilities; you can probably just extend Tailwind with your thing. It’s likely that it makes a lot more sense for your use case, and if it turns out that it makes sense for everyone, have the discussion about bringing it in. The upside to small OSS projects is that you can move at whatever pace you want and break them all you want when the user base is just you. As you move your stuff into Tailwind, you have to be very careful not to break the massive Tailwind user base, and you have to bend to Tailwind’s release schedule.
...
Rules
It’s so easy to forget low-level tech sometimes. Things that are dead simple and just work without a hitch. git is one of those rock-solid things where it’s easy to remember everything it does, and this is a classic use case.
This just works
cd /parent/directory/for/repo
git clone ssh://username@server/path/to/repo
In order for the remote to receive pushes, you must update its config to allow receiving on the checked-out branch.
git config receive.denyCurrentBranch updateInstead
Now you can pull, update, and push.
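The whole flow can be demoed self-contained with local paths standing in for the ssh:// URL (repo names here are illustrative):

```shell
set -e
tmp=$(mktemp -d)

# "server" side: a normal (non-bare) repo that accepts pushes to its checked-out branch
git init -q "$tmp/server-repo"
git -C "$tmp/server-repo" -c user.email=me@example.com -c user.name=me \
    commit -q --allow-empty -m init
git -C "$tmp/server-repo" config receive.denyCurrentBranch updateInstead

# "workstation" side: clone, commit, push straight back
git clone -q "$tmp/server-repo" "$tmp/clone"
echo note > "$tmp/clone/note.txt"
git -C "$tmp/clone" add note.txt
git -C "$tmp/clone" -c user.email=me@example.com -c user.name=me \
    commit -q -m "add note"
git -C "$tmp/clone" push -q origin HEAD

# updateInstead updated the server's working tree too, not just its refs
cat "$tmp/server-repo/note.txt"
```

Without `updateInstead`, that push would be refused because the target branch is checked out on the server.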
...
Well done write-up about reflecting solar energy back to earth from low-orbit space. I did not know this was a thing; apparently it is/isn’t. Solar is a great technology; its largest limitation is that it’s not consistent. This tech does not fix that problem. What does is efficient long-term storage. I’ve seen some crazy ideas going back to my days in school, maybe elementary school. There are a lot of innovative ways to store potential energy by moving heavy objects uphill, whether fluid or solid. The issue is that energy storage at grid scale is HUGE and not efficient enough. Even assuming this idea had any legs at all, it still doesn’t solve the problem of inconsistent power because it still can’t go through clouds!
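To put a number on how HUGE gravity storage gets at grid scale, a quick E = mgh back-of-envelope (the 500 m head is my own generous assumption):

```python
# back-of-envelope for pumped/gravity storage: E = m * g * h
g = 9.81                     # m/s^2
h = 500                      # meters of head; a generous reservoir height
one_gwh_joules = 1e9 * 3600  # 1 GWh = 3.6e12 J

mass_tonnes = one_gwh_joules / (g * h) / 1000
print(f"water needed to store 1 GWh at {h} m head: {mass_tonnes:,.0f} tonnes")
```

That is hundreds of thousands of tonnes of water lifted half a kilometer for a single gigawatt-hour, before counting round-trip losses.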
...
Wild to see the LinkedIn post linked here and how out of touch this feels. I find it astonishing that they have something as ingrained into gaming culture as Twitch, yet build something like Prime Gaming. Maybe I have no idea what Prime Gaming is, but it feels like the opposite of ownership. What I get from Steam is a sense of ownership. I own the desktop/laptop/handheld; no one cough Nintendo cough cough can remotely disable my device for using it inappropriately. I have a sense of trust with Steam that as long as Gabe is alive, I own what I paid for and will be able to open up and play anything at any time on any device I want. It might be a $100 Dell workstation rescued from the corporate refurb bin, it might be a high-end machine, it could be my 2010 Gateway or my 2045 custom build, and they are all likely to play a good amount of my library at some level. I still understand that I really own nothing, and the moment Steam turns off its servers it’s quite likely that everything is broken, but it’s by far the best we have. Far from the status quo we are headed towards with subscription and cloud-based gaming. If they wanted to...
...
ROASTED
Unfortunately that game uses some of the worst spyware in the industry; it will never work outside of Windows with Secure Boot enabled and TPM hardware.
Consider Dota 2 or other MOBAs by competent developers.
Great justification for using the cloud. The infrastructure requirement for signal to be such a great app would be massive for a small team with low budget. The cloud is fantastic at unknown scaling, bursts beyond reasonable capacity to run yourself, getting compute everywhere in the world, and offloading huge infrastructure management costs.
DHH is 100% right that we have gone too far; too many things come out cloud-first for services that can be run locally cough such as your bed cough cough. One week ago when the world came to a halt, I did not bat an eye at these small teams with complex requirements going down with AWS.
Their own products seem quite damning to me. It signals that they cannot make even their own products resilient to their own outages. It shows how hard this problem is, and how much it costs in complexity and resources. I’m sure there are failovers that happened successfully that we will never hear about: critical products with large engineering overhead.
...
I often want to run an s3 sync in an isolated environment, I don’t want to set any environment variables, I don’t want anything secret in my history, and I don’t want to change my dotenv into something that exports variables, I just want s3 sync to work. dotenv run is the tool that I’ve been using for this, and this uv one liner lets it run fully isolated from the project.
uv tool run --from 'python-dotenv[cli]' dotenv run -- uv tool run --from awscli aws s3 sync s3://bucket data
multi-line
same thing formatted for readability
uv tool run \
    --from 'python-dotenv[cli]' \
    dotenv run -- \
    uv tool run \
        --from awscli \
        aws s3 sync s3://dropper data
There are probably 10 ways to skin this cat, but this is what I did, if you have a better way let me know, I’ll link you below.
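For anyone unfamiliar with what `dotenv run` is doing under the hood: it reads KEY=VALUE pairs from a .env file and launches the command with them merged into the child’s environment, never exporting anything to your shell. A stdlib-only sketch of that behavior (file contents and key names are made up for illustration):

```python
# sketch of `dotenv run -- cmd`: parse a .env file, run the command with the
# merged environment. Illustrative only; the real CLI handles quoting, export
# prefixes, and more. Values below are fake.
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    env_path = os.path.join(tmp, ".env")
    with open(env_path, "w") as f:
        f.write("AWS_ACCESS_KEY_ID=AKIAEXAMPLE\n# a comment\nAWS_SECRET_ACCESS_KEY=secret123\n")

    # parse KEY=VALUE lines, skipping blanks and comments
    extra = {}
    for line in open(env_path):
        line = line.strip()
        if line and not line.startswith("#"):
            key, _, value = line.partition("=")
            extra[key] = value

    # launch the child with the merged environment; our own shell sees nothing
    out = subprocess.run(
        [sys.executable, "-c", "import os; print(os.environ['AWS_ACCESS_KEY_ID'])"],
        env={**os.environ, **extra},
        capture_output=True,
        text=True,
    ).stdout.strip()
    print(out)  # → AKIAEXAMPLE
```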
First 3D Printed Threads
Working on an upcoming project that requires some threaded screws. Trying to keep a low budget on this one, with as much coming off of the printer as I can. It might become a slant3d portals product if it works out. I always like making test prints for stuff like this, especially to see how the feel is coming off of the printer that is going to print the final product on a much longer run. First try was a success.
I started out looking up standard half-inch thread pitch and size, but ran out of time to get the exact profile of a half-inch bolt, so I will need to fix that later.
The print orientation is critical for strength here. This part is a full 1/2" so it should be strong either way, but to be safe we are printing the bolt horizontally to get nice long print layers. To do this we have to give it a bit of a flat spot on the top and bottom. This does not hurt performance; if anything, it probably helps give some room for poor tolerances.
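Since I still owe myself the exact profile, here is the starting math for a standard 1/2-13 UNC coarse bolt (13 threads per inch and the 60° form are the Unified Thread Standard values; double-check against a spec sheet before modeling):

```python
import math

# standard 1/2-13 UNC coarse thread, per the Unified Thread Standard
major_diameter_in = 0.5
threads_per_inch = 13

pitch_in = 1 / threads_per_inch   # crest-to-crest distance
pitch_mm = pitch_in * 25.4        # CAD tools usually want mm
# fundamental triangle height for the 60° thread form: H = P * sqrt(3)/2
height_in = pitch_in * math.sqrt(3) / 2

print(f"pitch: {pitch_in:.4f} in ({pitch_mm:.3f} mm)")
print(f"60-degree thread height H: {height_in:.4f} in")
```

For FDM it is common to loosen these nominals with extra clearance between bolt and nut, which is exactly what the flat spots and test prints are feeling out.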
Atuin Desktop sounds dope AF; tried to install it off the AUR and it was broken for me. Seems early, and the dev team is all-in on Mac. They have an official .deb and .rpm. I’ll have to try again later; maybe the binary will work.
The idea of building out runbooks from my Atuin data sounds dope AF. It sounds like a mix of markdown and executable cells, like a Jupyter notebook but not quite. Really pitching hard to those of us in the system administration, DevOps, SRE space. Having something to walk through when a system goes down and you are feeling panicked in DR mode sounds like a relief.
Cloud is cooked bois. Seriously, too much dumb shit relies on the cloud. Too much critical shit relies on single AZs. If normies are literally losing sleep over an AWS outage, then (cue the Uncle Roger voice) You’ve Fucked Up. It’s wild to even think about a bed relying on the cloud, let alone fully stopping work when us-east-1 goes down. I want to live in a world of opt-in FEATURES, things that bring value to a product because they make it better. Somehow a bed smells suspiciously like a cash grab for a subscription because it’s cloud connected. And yet for some reason it takes 16 GeeeBees per month. I don’t own one of these, and I don’t want to. I don’t want a subscription for everything; I want my shit to just work. We are headed towards a world that is ever more reliant on a few key clouds. Which is fine. It’s fantastic that small companies can start and scale without owning an infrastructure team. It’s great that they have the ability to give us many nines of reliability. Some things just don’t need the cloud.
More human stuff, that’s what we will be doing. Less looking at docs, more architecting (which suspiciously looks like writing docs), more decision making, more explaining. This is a good, positive take on AI right now.
...