This is possibly one of the more cursed single sentences I've ever seen in a job posting
-
Asta [AMP] replied to Xandra Granade 🏳️⚧️
@[email protected] @[email protected] @[email protected] @[email protected] "no no, C++ is good now, you just need to use C++46!"
-
@xgranade @aud @SnoopJ @glyph you know how in tarot, death represents change?
yeah sometimes in technology a system just gets too large and inflexible and there's only one type of change left that it can really experience ><
(but also see our recent blog post "The Cybernetic Bias", because we do think that bias is in play here)
-
Xandra Granade 🏳️⚧️ replied to Asta [AMP]
@aud @SnoopJ @glyph @ireneista There is Python on Pylons, and I'm pretty sure I've seen one for PHP as well but I forget the name.
-
Asta [AMP] replied to Xandra Granade 🏳️⚧️
@[email protected] @[email protected] @[email protected] @[email protected] pylons? not enough minerals for that.
-
Xandra Granade 🏳️⚧️ replied to Irenes (many)
@ireneista @aud @SnoopJ @glyph Yeah, the pointer-aliasing thing is real as fuck. I mean, BLAS and LAPACK are still mostly FORTRAN for a reason.
I have spent far too much of my life getting F2Py working for precisely that reason.
-
@[email protected] @[email protected] @[email protected] @[email protected] huuuuuuh. And of course, the fool represents the beginning, which is basically what everyone thinks of you if you start a new language
-
@ireneista @xgranade @aud @glyph my direct supervisor in grad school wrote F95 like it was a second language, and ended up with a parallel track of development: a Runge-Kutta solver that did the same thing as the "normal" C++ code I was trying to press into service for the task we needed.
My stance is that FORTRAN is Good, Actually. The design scope was the right size to begin with, and there's just no arguing with what it takes to compete with LAPACK et al.
Fun to write, though? Absolutely not.
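[For reference, the classic fourth-order Runge-Kutta update that both of those codes would have been implementing; a minimal Python sketch, not a quote from either codebase:]

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta (RK4) step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

# e.g. one step of dy/dt = -y from y(0) = 1.0
print(rk4_step(lambda t, y: -y, 0.0, 1.0, 0.1))  # ~0.90484, close to exp(-0.1)
```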
-
Xandra Granade 🏳️⚧️ replied to SpookJ 👻
@SnoopJ @ireneista @aud @glyph If it was zero-based, had a sensible syntax for declaring function parameters, didn't have the weird function/subroutine split, and didn't hard-crash if you added print statements inside a function called from a print statement, then I'd likely even agree.
There's a lot to like about FORTRAN, but it just has too many papercuts stemming from being one of the first languages, period, imho.
-
SpookJ 👻 replied to Xandra Granade 🏳️⚧️
@xgranade did you ever run into Forthon?
That was another Grad School Adventure, and my encounter with it was what taught me that Python 2 lets you mutate the values of True, False.
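[A minimal illustration of that Python 2 quirk; under Python 3 the first assignment is a SyntaxError:]

```python
# Python 2 only: True and False are ordinary names bound in __builtin__,
# so a plain assignment can shadow them.
True = False
print True        # prints: False (the *name* True now refers to the False object)
print 1 == 1      # prints: True  (comparisons still return the real singleton)
```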
-
Asta [AMP] replied to Xandra Granade 🏳️⚧️
@[email protected] @[email protected] @[email protected] @[email protected] sounds like we need to write FORCIS, then, as a... actually nevermind.
Despite not knowing FORTRAN, I have to agree with the assessment that it is good; there is a reason we continue to use FORTRAN libraries wrapped up in other codes.
#chapelLang (geez Audrey shut up about Chapel) used to default to indexing by 1 (it now uses 0 indexing, as it well should) but also would let you define the index however the fuck you wanted. I think it still does. You want -10 to -40? It's yours, baby. Do it. Who can stop you?
-
Xandra Granade 🏳️⚧️ replied to SpookJ 👻
@SnoopJ @ireneista @aud @glyph No, but I was excited about FORTRESS for a long time. It was Sun's attempt to modernize FORTRAN, and it included so many good ideas: you could declare which ring and field axioms a data type obeyed, which would then automatically generate unit tests and give the compiler extra assumptions to work with.
Each program also had a canonical, unambiguous transformation to a LaTeX document, making it easy to embed FORTRESS in papers.
-
Irenes (many) replied to Xandra Granade 🏳️⚧️
-
SpookJ 👻 replied to Xandra Granade 🏳️⚧️
@xgranade @ireneista @aud @glyph neat, TIL
-
Xandra Granade 🏳️⚧️ replied to SpookJ 👻
@SnoopJ @ireneista @aud @glyph It got abandoned in very early alpha versions, but it was fucking cool in how it was designed.
-
Xandra Granade 🏳️⚧️ replied to Irenes (many)
@ireneista @aud @SnoopJ @glyph POLYENBY (am I doing this right?)
Anyway, of all things, Visual fucking Basic let you redefine array bases at the compilation-unit level, which created no small number of bugs.
-
Asta [AMP] replied to Xandra Granade 🏳️⚧️
@[email protected] @[email protected] @[email protected] @[email protected] I wish literally anyone except for Oracle had purchased Sun.
-
Xandra Granade 🏳️⚧️ replied to Irenes (many)
@ireneista @SnoopJ @aud @glyph IKR? I keep thinking that really just needs to be in any new language. Generalize it so that you can just list parameterized invariants. But that just gets back to Hypothesis and similar libraries... the ring and field axioms were even cooler than that for what information they gave the compiler.
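[A rough sketch of the "list parameterized invariants" idea using the Hypothesis library in Python; unlike Fortress, this only gets you the generated tests, not the extra assumptions for the compiler:]

```python
from hypothesis import given
from hypothesis import strategies as st

ints = st.integers()

@given(ints, ints, ints)
def test_add_associative(a, b, c):
    assert (a + b) + c == a + (b + c)

@given(ints, ints)
def test_mul_commutes(a, b):
    assert a * b == b * a

@given(ints, ints, ints)
def test_mul_distributes_over_add(a, b, c):
    assert a * (b + c) == a * b + a * c

if __name__ == "__main__":
    # Hypothesis-decorated properties can be called directly.
    test_add_associative()
    test_mul_commutes()
    test_mul_distributes_over_add()
    print("ring-ish axioms held on all generated examples")
```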
-
@SnoopJ @xgranade @aud What I really want is the LLM to translate idiomatic COBOL to idiomatic Java and then the transpiler and some formal methods to prove that the idiomatic Java is identical to the idiomatic COBOL, but that sounds like hard work and real computer science and software engineering so it’s probably not what they’re doing
-
Asta [AMP] replied to Xandra Granade 🏳️⚧️
@[email protected] @[email protected] @[email protected] @[email protected] oh Jesus. Chapel just basically exposed the range map to the user, should you wish it, for any individual array. I think you could still compile assuming the default index started at 1 as an option but that was just during the change; I don’t think you could arbitrarily change them all at the compile level. Oooof.