Hash collision in Apple NeuralHash model - https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issues/1
@stman @theruran @natecull @vertigo @Kooda I absolutely agree with that, though. I think most important conclusions, at least in my mode of thought personally, are reached by stepping back from the formalisms that more explicitly define one field and taking a general perspective on patternicity across multiple conventionally separate domains, looking for connections & emergent conclusions. That and ignoring received wisdom about necessary complexities, which are often actually incidental.
@theruran Hello James. OK, I have my Mastodon archive, but the JSON file is huge and grepping it doesn't work well since there are no line-feed separators. I'd need a tool to dig into and search the file easily. What do you recommend, given that the Linux text editors I tried crash because the file is too big? Is there a file editor that works directly on the file and doesn't load everything into RAM? I dunno?
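One way to search a huge single-line file without loading it into RAM is to scan it in overlapping chunks. A minimal sketch in Python (`stream_search` is a hypothetical helper written for illustration, not an existing tool):

```python
# Sketch: scan a multi-gigabyte, single-line JSON file for a pattern
# without loading it into RAM, by searching overlapping chunks.
def stream_search(path, needle, chunk_size=1 << 20):
    """Yield absolute byte offsets of every occurrence of `needle` in the file."""
    needle_b = needle.encode()
    overlap = max(len(needle_b) - 1, 0)
    base = 0          # absolute file offset of buf[0]
    buf = b""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            buf += chunk
            i = buf.find(needle_b)
            while i != -1:
                yield base + i
                i = buf.find(needle_b, i + 1)
            # keep a short tail so matches spanning two chunks aren't missed
            keep = buf[len(buf) - overlap:] if overlap else b""
            base += len(buf) - len(keep)
            buf = keep
```

For actually parsing (rather than just locating text in) a large JSON export, streaming parsers such as jq's `--stream` mode or Python's `ijson` library avoid loading the whole document as well.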
@natecull Hi. Please can you have a look at the last post I just sent you on Krali? I need you to elaborate on something you wrote. Thanks. Fred.
@syntacticsugarglider I just mean that computation and other real-world processes are durative and event-driven. If reductions are constant-time, then how much time, exactly, for each? Because instantaneous processes are mathematical fictions, am I right?
like @natecull mentioned earlier, the world is a mess of data and we need a meta-language to deal with it all as reliably as possible. as M. Hamilton's presentations put it, we need to develop ultrareliable libraries/subsystems so that systems (and their development) can gradually benefit from a library of proven specifications+implementations. in other words, once we write a formally-verified JSON specification+implementation, we should never need another one - assuming we are targeting some abstract machine. and then we can get on with developing real-world systems instead of playing in the sandbox.
I think Mastodon has made a deliberate architectural decision to make searching timelines hard-to-impossible, out of a belief that easy searching of social media is equivalent to organized mass harassment and therefore is behaviour that the platform must automatically prevent.
I don't really understand this argument myself, and I think it makes Mastodon-like systems incredibly inefficient for a lot of group work.
@stman which author are you thinking of? there were some revisions of situation calculus done decades later than McCarthy's seminal paper.
@syntacticsugarglider so you have basically translated the Symmetric Interaction Combinators into a Rust implementation with reflection that compiles to interaction nets, and called it Welkin? Can it only model and execute deterministic computations? What about non-determinism, is it just unwanted? Interaction nets do look like a very convenient formalism, but as @stman indicates, can the time dimension be added to these formal systems without breaking them, while also adding the ability to model real-time systems?
Sorry if you have already answered these questions. I am just now getting thru all the wonderful messages.
@theruran @stman @natecull @vertigo I've mentioned this a few times but I've been away from the project for a good while due to mental health issues and have only resumed work in the past few days. I've already fixed all the remaining correctness issues with the evaluators, but there are some subtle correctness issues with the typechecker that I need to resolve before fleshing out the crucial reflection primitives (though that is *fairly* trivial) and building an interactive environment
@theruran @stman @natecull @vertigo right now the "first-class evaluator" to the extent there is one is a CPU-side inet evaluator, and there's also an on-GPU inet evaluator and a conventional CPU beta-reduction evaluator. the GPU inet evaluator is definitely by far the fastest but its VRAM strategy is incorrect and for terms that expand a lot before converging it will silently fail, which is easy to fix but there are other things that are more important at the moment.
@theruran @stman @natecull @vertigo The language records more than enough information about data sizing and structure to generate native threaded code or some such without garbage collection or even complex heap management routines like a conventional memory allocator. I haven't implemented any sort of native target other than a scheme backend since it's simply not a priority at the moment, but I eventually want full self-hosting with native codegen in welkin for some subsets.
@theruran @stman @natecull @vertigo for dealing with hard real-time applications I'm less interested in expanding the inet evaluator to handle that, since it's intrinsically very parallel and doesn't have any sort of priority queue or fairness system, which is actually quite desirable. However, if you need to characterize latency etc. then compiling some terms in a single-source environment to native instructions is entirely feasible.
@theruran @stman @natecull @vertigo Various sorts of process calculi for temporally evolving systems can easily be modelled, though in the shorter-term I'm most concerned with implementing logic programming tools to assist in manual unification of constraints in the editor & query over semantic environments. For highly stochastic systems I've recently begun to learn about probabilistic programming, and some sort of probability monad with an MCMC backend is in the eventual future.
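At its core, the kind of MCMC backend mentioned above could be as simple as a random-walk Metropolis-Hastings loop. A minimal sketch (the target log-density and step size are illustrative assumptions, not anything from Welkin):

```python
import math
import random

def metropolis_hastings(log_density, x0, n_samples, step=0.5):
    """Draw samples from an unnormalized log-density via a
    random-walk Metropolis-Hastings chain."""
    x = x0
    lp = log_density(x)
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)
        lp_new = log_density(proposal)
        # accept with probability min(1, p(proposal) / p(x))
        if lp_new >= lp or random.random() < math.exp(lp_new - lp):
            x, lp = proposal, lp_new
        samples.append(x)
    return samples

# e.g. sample from a standard normal, whose log-density is -x^2/2 + const
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000)
```

A probability monad would sit on top of something like this, composing distributions into a single log-density that the sampler explores.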
@theruran @stman @natecull @vertigo with regard to determinism/non-determinism, I treat welkin as a model of *computation* itself, not as a model of all that is computable, and further semantics can be embedded therein, since it is highly expressive. It's an abstract machine, not a language, and I intend to target it with a higher-level interactive programming environment that will provide non-special-cased primitives for other paradigms.
@theruran @stman @natecull @vertigo ah so
the symmetric interaction combinators are a form of interaction net, interaction nets being a more general formalism for a specific sort of graph reduction system based on active-pair "redexes" to track points where reduction can occur in the network
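Active-pair reduction can be sketched concretely. The toy below follows Lafont's classic unary-arithmetic example (agents Zero, Succ, Add), not Welkin's actual implementation: every agent has one principal port, and whenever two principal ports are wired together that pair is a redex that gets rewritten by a local rule.

```python
# Toy interaction-net reducer: unary addition via Zero/Succ/Add agents.
# Illustrative only; agent names and rules follow Lafont's example.

class Agent:
    def __init__(self, name, arity=0):
        self.name = name
        self.ports = [None] * (arity + 1)  # ports[0] is the principal port

def link(a, i, b, j):
    a.ports[i] = (b, j)
    b.ports[j] = (a, i)

def reduce_net(agents):
    """Rewrite active pairs (two principal ports wired together) to normal form."""
    changed = True
    while changed:
        changed = False
        for a in list(agents):
            if a not in agents or a.ports[0] is None:
                continue
            b, j = a.ports[0]
            if j != 0:
                continue                      # not an active pair
            names = {a.name, b.name}
            if names == {"Add", "Zero"}:
                add = a if a.name == "Add" else b
                zero = b if add is a else a
                (p, pi), (q, qi) = add.ports[1], add.ports[2]
                link(p, pi, q, qi)            # add(0, y) = y: wire y to the result
                agents.discard(add)
                agents.discard(zero)
                changed = True
            elif names == {"Add", "Succ"}:
                add = a if a.name == "Add" else b
                s = b if add is a else a
                x, xi = s.ports[1]            # predecessor inside the Succ
                r, ri = add.ports[2]          # current result wire
                link(add, 0, x, xi)           # add(S(x), y): recurse on x ...
                link(s, 1, add, 2)            # ... and wrap the sub-result in Succ
                link(s, 0, r, ri)
                changed = True
    return agents

def numeral(n, consumer, port):
    """Build the unary numeral n and wire its output into (consumer, port)."""
    prev = Agent("Zero")
    agents = {prev}
    for _ in range(n):
        s = Agent("Succ", 1)
        link(prev, 0, s, 1)
        agents.add(s)
        prev = s
    link(prev, 0, consumer, port)
    return agents

def readback(root):
    """Count the Succ chain hanging off the root's single port."""
    n, (a, _) = 0, root.ports[0]
    while a.name == "Succ":
        n += 1
        a, _ = a.ports[1]
    return n

# demo: 2 + 1 = 3
add = Agent("Add", 2)
root = Agent("Root")
link(add, 2, root, 0)
net = {add, root} | numeral(2, add, 0) | numeral(1, add, 1)
reduce_net(net)
assert readback(root) == 3
```

Because each rule only touches the two agents of one active pair, disjoint redexes can in principle be rewritten in parallel, which is what makes the formalism attractive for GPU evaluation.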
Welkin is a dependently typed functional programming language based on a restricted lambda calculus that compiles to these combinators, and the current reference implementation is in Rust
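For contrast with the interaction-net approach, a conventional beta-reduction evaluator for a lambda calculus can be sketched in a few dozen lines. This is a generic textbook normal-order evaluator over de Bruijn-indexed terms, written for illustration, and is not Welkin's code:

```python
# Terms: ("var", n) | ("lam", body) | ("app", f, x), with de Bruijn indices.

def shift(t, d, cutoff=0):
    """Shift free variables (index >= cutoff) in t by d."""
    tag = t[0]
    if tag == "var":
        return ("var", t[1] + d) if t[1] >= cutoff else t
    if tag == "lam":
        return ("lam", shift(t[1], d, cutoff + 1))
    return ("app", shift(t[1], d, cutoff), shift(t[2], d, cutoff))

def subst(t, j, s):
    """Substitute s for variable j in t."""
    tag = t[0]
    if tag == "var":
        return s if t[1] == j else t
    if tag == "lam":
        return ("lam", subst(t[1], j + 1, shift(s, 1)))
    return ("app", subst(t[1], j, s), subst(t[2], j, s))

def normalize(t):
    """Reduce t to beta-normal form, normal (leftmost-outermost) order."""
    if t[0] == "app":
        f = normalize(t[1])
        if f[0] == "lam":  # beta step: (\.b) x  ->  shift(-1, b[0 := shift(1, x)])
            return normalize(shift(subst(f[1], 0, shift(t[2], 1)), -1))
        return ("app", f, normalize(t[2]))
    if t[0] == "lam":
        return ("lam", normalize(t[1]))
    return t
```

Compared with the term-tree rewriting above, an interaction-net evaluator performs the same work as local graph rewrites with explicit sharing, which is where the parallelism comes from.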
Researcher on IC Security Flows, P2P & Mesh Networks, FPGA & Microprocessor Architectures.
A bunch of technomancers in the fediverse. Keep it fairly clean please. This arcology is for all who wash up upon its digital shore.