Actually had to add BLFC to my Twitter muted-words list for a 7-day duration because of the timeline updates. Holy crap, for a while there, it was almost like being subscribed to Twitter's firehose API.

US Supreme Court floats a startling expansion to police immunity from the law. A new opinion suggests that “qualified immunity” could become something much closer to absolute immunity from lawsuits.

This is what an actual totalitarian government looks like in practice.

Note that this doesn't mean that x86 is dead. Very far from it. It just means that my prediction of x86/Arm/RISC-V for data center/desktop/embedded applications is on track to being vindicated.

Intel is switching to a foundry business model because they cannot keep up with x86 designs, which tend to be locked to specific manufacturing nodes, and they can't make new nodes cost-effectively anymore.

This also confirms my gut feeling about why they are trying to get cozy with SiFive and the RISC-V community. With everyone wanting to use embedded processors that don't require sacrificing substantial income to Arm, RISC-V is the perfect vehicle to advance Intel over Arm.

"It’s strange, isn’t it? The ideology of capitalism is that it is a system that generates immense abundance (so much stuff!). But in reality it is a system that relies on the constant production of scarcity.

This conundrum was first noticed back in 1804, and became known as the Lauderdale Paradox. Lauderdale pointed out that the only way to increase “private riches” (basically, GDP) was to reduce what he called “public wealth”, or the commons. To enclose things that were once free so that people have to pay in order to access them. To illustrate, he noted that colonialists would often even burn down trees that produced nuts and fruits so that local inhabitants wouldn’t be able to live off of the natural abundance of the earth, but would be forced to work for wages in order to feed themselves. "

Degrowth: A Call for Radical Abundance

Food, check your Halloween candy meme 

Check your Halloween candy!

I found a copy of "The C++ Programming Language" by Bjarne Stroustrup and "Clean Code" in a Twix bar!

Boost to save a life!

I finally found a reason why I should learn Nim (the programming language). It has complete interop with C++, which means that, from an HDL EDA point of view, I could use Nim to interact with Verilator simulations of the hardware.

Hmm... 🤔

There are a variety of ways to classify nebulae. One is by the means in which we see them. Emission nebulae, like M16 here, are generally red and seen by the light given off by excited hydrogen atoms. The H is excited by nearby hot stars, causing the red glow.

One thing caught my eye on the "Programming Language Energy Benchmark" report:

Imperative languages use less energy than OO and functional languages.

It doesn't surprise me, 'cause all my life I've seen that CPUs are imperative: read memory at position X, move the value read into a register, add 1 to the register, move the register value back to memory, and so on.
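That read/modify/write cycle, sketched as a toy machine in Python (the `memory` list and `reg` variable are purely illustrative, not any real ISA):

```python
# Toy sketch of the imperative cycle a CPU runs, with a Python list
# standing in for RAM and a plain variable standing in for a register.

memory = [0, 0, 41, 0]   # pretend RAM
X = 2                    # the position to read from

reg = memory[X]          # read memory at position X into a "register"
reg = reg + 1            # add 1 to the register
memory[X] = reg          # move the register value back to memory

print(memory)            # the cell at X is now 42
```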

For other programming paradigms to use less energy, I bet the CPU instruction would have to focus on that paradigm.

Also, one might expect a compiler to turn all those objects and monads and functors into imperative code, but I guess that's not so simple.

(Also, scripting languages are, basically, a CPU emulation layer: they convert the script into a CPU-like "language", which is then converted into the real thing.)
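Python itself illustrates that layering: the standard `dis` module shows the bytecode ("CPU language" for the Python virtual machine) that a script is compiled to before the interpreter loop executes it on the real CPU.

```python
import dis

def add_one(x):
    return x + 1

# List the bytecode instructions the interpreter will execute.
# The exact opnames vary by Python version (e.g. BINARY_ADD before
# 3.11, BINARY_OP after), but the shape is the same: load operands,
# do the operation, return the result.
instructions = [ins.opname for ins in dis.get_instructions(add_one)]
print(instructions)
```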

Subtoot for a commercial product (---) 

If your product offers a mode to use it via the command-line interface, and that CLI has a hard dependency on X11, and I find out where you live ... you better be ready for some serious egging of your home.

The egging will continue until you fix this very obviously untested use-case as well.

I'm serious. I'm tired of your shit.

Python dev 

In recent years I've got a lot of mileage out of Python, but the packaging, building, and distribution story is one that has been in constant flux for as long as I've been developing with it. I never feel like I'm doing it right, and the blog post below confirms I have a boatload of code that is apparently doing it wrong. This shifting packaging story is a constant source of friction in this particular ecosystem, one that makes me wish I hadn't started down the Python path.

What's this? Salesforce (nee Heroku) e-mailing me about my account not having 2FA? And, I've not been a Heroku customer for over 6 years? And there's no unsubscribe option in their mass e-mail?

Oh, look, they have a link to a Google Form. And that has a "Report Abuse" option.

It'd sure be a shame if I report them for spam...

Nebulae are large clouds of dust and gas floating in space. They are often hundreds of light-years across.

Another reminder on anonymity. When Twitter analysed the racist abuse following the Euros this summer, 99% of the accounts they had to suspend were *not* anonymous. 99%.
Anonymity is *not* the problem.

Dragon Player is a free and open-source, simple and minimalistic media player for Linux.

Install: Deb, RPM, and Snap packages

you can tell lawns are colonialism in force because of the inordinate spending to maintain them — when capitalism otherwise prefers to cut costs everywhere — and when they're mostly just nice to look at (i guess) from the office or kitchen window

Does anyone have any recommendations for books about compiler architecture? I've read good things about Engineering a Compiler, by Cooper and Torczon, and also read that the dragon book is a bit outdated.

Boosts welcome.

EDA gripe, FPGAs, Yosys, subtoot, opinion piece 

I recently read a toot that said Yosys and related tools are strictly inferior to commercial, vendor-provided FPGA programming tools.

I can see why they would say this; and to some extent, they would be right. However, I disagree with the totality of their statement. It leads me to believe that they have never truly worked with commercial tools before.

I am biased; my experience lies with Xilinx's tools. That said, I would willingly surrender Xilinx tooling for Yosys and its surrounding ecosystem any day.

First, Xilinx tools are sssssssssslllllllllllooooooooooowwwwwwwwwwwwwww. OMG, are they so very, very slow. To compile my Kestrel-2DX onto a 3SC1000-class FPGA (I forget the specific part now), it took just about five minutes. For Yosys and (then) arachne-pnr to produce a design for the iCE40HX8K took seconds.

Second, I don't have to deal with missing files from a distribution, which has been a perennial problem with Xilinx. My most recent experience with missing Xilinx software was, of all things, the license server. Like, yeah, how do you expect me to run your software without a license server? Yosys et al. do not need license servers, at the very least. Yes, compiling it every damn time you want to refresh your toolchain is a right pain in my ass (feel free to see my toot history about my frustrations there). But at least I end up with working software, because all the important pieces are there.

Finally, every vendor toolchain supports a different dialect of Verilog. Every. Single. One. Which leads to designs with a crap-ton of `ifdefs and other preprocessor symbols sprinkled liberally throughout, just to tailor a design to each vendor's unique build tools. Yosys does not support the newest Verilog features, I get it; however, it is consistent across all the targets that it does support.

I could go on, but I feel like I'm beating a dead horse here.

Suffice it to say, EDA tools suck. Yosys isn't perfect. But despite the lack of glitz, and the lack of design wizards that automate the selection and instantiation of vendor-specific modules (many of them quite useful; PLLs, for example), I still prefer using Yosys because it is a more stable, more productive platform in the end.

But, that's just me.

