"Hi I'm new here at this makerspace. Oh, what am I making? Friends, hopefully."


Open hardware laptop MNT Reform now available in both DIY (starting at €899!) and assembled + many customization options in our store: shop.mntmn.com/products/mnt-re

#TIL When a Class 2 ceramic capacitor is heated above its Curie temperature (130 °C), the crystal structure changes. Soldering the cap causes a sudden increase in capacitance beyond spec (so don't attempt to hand-match ceramic caps!), followed by a logarithmic decline (aging). This is reversible: resoldering the cap or baking the board can "reset" an aged cap back to its initial state.

"Referee time" is the time for a cap's capacitance to decline back to its standard spec value (X7R cap ages slowly, referee time is 1,000 h., X5R is only 48 h.). A DC bias not just reduces the effective capacitance, but further accelerates aging in a nonlinear way.

As usual, Class 1 NP0/C0G capacitors don't exhibit any of these effects. #electronics
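
(Not part of the original toot: a minimal C sketch of that logarithmic aging curve, assuming a hypothetical 10 µF X5R part with a ~5%-per-decade-hour aging rate and a 48 h referee time. The model and the numbers are illustrative, not datasheet values: fresh out of reflow the part reads above spec, crosses its rated value at the referee time, then keeps drifting down.)

#include <math.h>
#include <stddef.h>
#include <stdio.h>

/* Assumed aging model: C(t) = C_ref * (1 - k * log10(t / t_ref)),
 * where t_ref is the referee time (the age at which the part is
 * specified to meet its rated value C_ref) and k is the aging rate
 * per decade hour. */
static double aged_capacitance(double c_ref_uF, double k_per_decade,
                               double t_hours, double t_ref_hours)
{
    return c_ref_uF * (1.0 - k_per_decade * log10(t_hours / t_ref_hours));
}

int main(void)
{
    /* Hypothetical 10 uF X5R part: ~5% loss per decade hour, 48 h referee
     * time -- illustrative values only. */
    const double c_ref = 10.0, k = 0.05, t_ref = 48.0;
    const double hours[] = { 1.0, 48.0, 1000.0, 10000.0 };

    for (size_t i = 0; i < sizeof hours / sizeof hours[0]; i++)
        printf("t = %7.0f h after reflow  ->  C ~ %5.2f uF\n",
               hours[i], aged_capacitance(c_ref, k, hours[i], t_ref));
    return 0;
}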

What if we started making scroll bars that were like easily visible again

RT @hondanhon@twitter.com

Square using the headphone jack on the iPhone as input for a mag stripe reader means in an alternate universe we could’ve had iPhone software distributed on cassette tape. twitter.com/rstevens/status/13

🐦🔗: twitter.com/hondanhon/status/1

fedimeta 

I’ve been dragged into some really pointless conversations here as of late, and I’m wondering if there has been another influx of users from other social networks into the fediverse?

@requiem I'm not sure whether you consider web apps (things like Overleaf or Discord in the browser) to be applications, but I feel like the big thing about the web as an application platform is the fact that it runs anywhere. Users are not required to download anything (besides the browser they probably already have), and maybe this is something that makes it attractive.

@requiem I've spent much of the last year on that. It's a long story for a toot 🤣

The web was never designed to be an application platform but it got turned into one anyway, and as a result it’s very bad at both what it is and what it was supposed to be.

This makes me wonder: what if we deliberately designed an application platform that is well made and incorporates whatever made the web attractive as an application platform (and discards what is not)?

The first step is to identify why the web was so misused; I have some ideas, but I'm sure they are incomplete.

Reading about Logo and the memories it brings back of programming the Apple II makes me notice a difference between these early machines and the more terminal-oriented systems like Unix and Linux, and to some degree even the PC. These early machines had a video mode that, while mostly used for text, integrated text and bitmap graphics somewhat seamlessly.

Makes me think twice about what I want out of a video interface for my

Weirdly drawn to Logo today, and it’s reminding me of Forth.

@requiem @thegibson If I may be allowed to be pedantic here, I ask that my words be considered with some gravity.

The issue isn't static logic. The issue is divorcing instruction decoding from instruction set design to attain performance goals not originally built into the ISA.

It takes, for example, several clock cycles just to decode x86 instructions into a form that can then be readily executed. Several clocks to load the code cache. Several clocks to translate what's in the code cache into a pre-decoded form in the pre-decode cache. Several clocks to load a pre-decode line into the instruction registers (yes, plural) of the instruction fetch unit. A clock to pass that on to the first of (I think?) three instruction decode stages in the core. Three more clocks after that, and you finally have a fully decoded instruction that the remainder of the pipelines (yes, plural) can potentially execute.

Of course, I say potentially because there's register renaming happening, there are delays waiting for instruction execution units to become available in the first place, there's waiting for result buses to become uncontested, ...

The only reason all this abhorrent latency is obscured is that the CPU literally has hundreds of instructions in flight at any given time. Gone are the days when it was a technical achievement that the Pentium had 2 concurrently running instructions. Today, our CPUs have literally hundreds.

(Consider: a 7-pipe superscalar processor with 23 pipeline stages, assuming no other micro-architectural features to enhance performance, still offers 23×7 = 161 in-flight instructions, provided you have some means of keeping those pipes filled.)
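
(Aside, not in the original post: the same arithmetic as a trivial C snippet, using only the 7-pipe / 23-stage figures quoted above. It's essentially Little's law: to sustain a given throughput while each instruction spends the full pipeline depth in flight, you need roughly width × depth instructions going at once.)

#include <stdio.h>

int main(void)
{
    const int pipes  = 7;    /* superscalar issue width (from the post)         */
    const int stages = 23;   /* pipeline depth ~ per-instruction latency (post) */
    const int in_flight = pipes * stages;

    printf("%d pipes x %d stages = %d instructions in flight\n",
           pipes, stages, in_flight);
    printf("To retire %d instructions per cycle while each spends %d cycles\n"
           "in the pipe, roughly %d must be in flight at any given moment.\n",
           pipes, stages, in_flight);
    return 0;
}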

This is why CPU vendors no longer put cycle counts next to their instructions. Instructions are pre-decoded into short programs, and it's those programs (strings of "micro-ops", hence micro-op caches and the like) which are executed by the core at a more primitive level.

Make no mistake: the x86 instruction set architecture we all love to hate today has been a shambling undead zombie for decades now. RISC definitely won, which is why every x86-compatible processor has been built on top of RISC cores since the early '00s, if not earlier. Intel just doesn't want everyone to know it because the ISA is such a cash cow these days. Kind of like how the USA is really a nation whose official measurement system is the SI system, but we continue to use imperial units because we have official definitions that map one to the other.

Oh, but don't think that RISC is immune to this either. It makes my blood boil when people say, "RISC-V|ARM|MIPS|POWER is immune."

No, it's not. Neither is MIPS, neither is ARM, neither is POWER. If your processor has any form of speculative execution and depends on caches to maintain instruction throughput, which is to say literally every architecture on the planet since the Pentium Pro demonstrated its performance advantages over the PowerPC 601, you will be susceptible to SPECTRE. Full stop. That's the laws of physics talking, not Intel or IBM.

Whether it's implemented as a sea of gates in some off-brand ASIC, in an FPGA, or on the latest nanometer-scale process node from the most expensive fab house on the planet, it won't matter -- SPECTRE is an artifact of the micro-architecture used by the processor. It has nothing whatsoever to do with the ISA. It has everything to do with the performance-at-all-costs, gotta-keep-them-pipes-full mentality that drives all of today's design requirements.
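
(Not from the original thread: for concreteness, this is roughly the textbook Spectre v1 bounds-check-bypass gadget from Kocher et al., lightly adapted. Nothing in it is x86-specific -- the leak is purely a micro-architectural side effect of speculating past the bounds check, which is the point being made above.)

#include <stddef.h>
#include <stdint.h>

uint8_t array1[16];
size_t  array1_size = 16;
uint8_t array2[256 * 4096];   /* probe array: one page per possible byte value */

void victim_function(size_t x)
{
    if (x < array1_size) {            /* the branch the CPU mispredicts */
        /* Executed speculatively even when x is out of bounds: array1[x]
         * reads a secret byte, and the dependent load below drags a
         * secret-indexed line of array2 into the data cache.  An attacker
         * later times accesses to array2 to recover that byte. */
        volatile uint8_t tmp = array2[array1[x] * 4096];
        (void)tmp;
    }
}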

I will put the soapbox back in the closet now. Sorry.

So hackers, what's your recommendation these days, OS-wise?

Just realized that I now have a laptop that might boot Haiku directly and have a working network interface…

I know what I’m doing tomorrow morning.

Signal created targeted ads for Instagram that show the personal data that Facebook collects about you and sells access to.

They were blocked.

signal.org/blog/the-instagram-

I think modern JavaScript standards and theology have a lot in common, in that both take arbitrary collections of strange folk traditions and try to derive some kind of grand unified philosophical perspective out of them.
