Hacker News | stoppingin's comments

Does Fortran need to? Does Fortran still have any real advantage over Julia, MATLAB, etc.?

Please forgive my ignorance. I've never written any Fortran. I understand how and where it's used in modern computing though. I also understand why it's faster than C in some cases.


> Does Fortran still have any real advantage over Julia, MATLAB, etc.?

Performance. Fortran is still much faster than Julia, MATLAB, etc.

It's also much easier to write fast Fortran than fast C. While C code written by an HPC expert will almost always be as fast as Fortran written by an expert, C written by a "mediocre" C programmer who is a domain expert solving a problem in the most obvious way will basically always be slower than Fortran code written by an equally "mediocre" Fortran programmer.


Fortran isn't faster than Julia. Most comparisons I've seen are ties, or Julia wins, as long as both implementations are somewhat competent.

Fortran actually makes it fairly hard to write fast code since it is missing some features. For example, I don't believe there is any way to write a Fortran program using BFloat16 numbers. You also don't have a great ability to write programs with a mix of strict IEEE semantics and fast-math semantics; you have to choose one as a compile-time flag.
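
For what it's worth, a minimal sketch of the per-expression mixing I mean, in Julia (a toy sum of squares; the function name is just illustrative):

  # Mix strict IEEE and fast-math semantics inside one function.
  function mixed_sumsq(x)
      strict = zero(eltype(x))
      fast = zero(eltype(x))
      for v in x
          strict += v * v                  # strict IEEE accumulation
          fast = @fastmath fast + v * v    # fast-math (reassociation allowed) for this expression only
      end
      return strict, fast
  end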


If you are ignoring the cost of the compiler, and a whole host of other things, sure. But the same can be said for most any modern programming language. A lot of Julia's public benchmarks are not idiomatic Julia, or packages were created to hide how non-idiomatic they are. Julia isn't a slouch after precompilation, but the time to warm the code up can exceed, by orders of magnitude, both the runtime and the compilation time of equivalent code in other languages. It's great for academic benchmarks though! Huge pain for CI and development.


You're not talking about Julia v1.9, are you? Packages precompile to binaries (.so/.dll) in that release, and it will support direct calls pretty soon. It sounds like you're talking about a much older Julia.

> A lot of Julia's public benchmarks are not idiomatic Julia or packages were created to elide how nonidiomatic they are.

??? https://docs.sciml.ai/SciMLBenchmarksOutput/stable/MultiLang... this is pretty standard code.

I do enjoy programming in Fortran, but let's at least keep it concrete and grounded in reality. The older Fortran versions do have a small set of optimizations that are hard to perform in other languages, because the lack of aliasing makes some otherwise difficult-to-prove optimizations possible. But the newer Fortran versions don't optimize as well without forcing things like ivdep, which is similar to Julia, and which is why you tend to get the same or similar machine code between LFortran and Julia (since both are using the same compiler backend, LLVM).
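
To make the ivdep comparison concrete, here's a rough sketch of what that opt-in looks like on the Julia side (a made-up axpy kernel, not code from the benchmarks):

  # Assert that loop iterations don't alias, so the compiler can vectorize
  # without emitting runtime aliasing checks.
  function axpy_ivdep!(y, a, x)
      @inbounds @simd ivdep for i in eachindex(x, y)
          y[i] += a * x[i]
      end
      return y
  end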


If by much older you mean the current stable Julia release, then sure... As of today Julia 1.8.5 is the stable release. So this post is traditional Julia community stuff... "It works and everything is perfect, you're the problem. We didn't create our own message board and stop posting in other places to try to control the narrative." Gaslight city.

I wasn't referring to package benchmarks, my apologies if that was unclear.


I'm glad I'm not the only one that feels this way. I love Julia as a language (not as an implementation, which is a pain-in-the-ass to work with), but there has long been a pattern of responses which in practice amount to gaslighting. AFAIK it's only a small set of people who do that, and charitably they're likely just being less-than-thoughtful about generalizing their "works for my project", "works on my 24-core computer", "works when I know half the people developing packages" experiences to everyone. But to an outsider, it looks like the answer is always "the problems are all fixed now, you're just out of date", and most of the time that turns out not to be true.


More than half the community feels that way; the other half are either sycophants or don't care. It's not like it's one person either, it's the language maintainers' general attitude. It's been called out in the past, but it sure hasn't changed, because there's a product to sell.


Your benchmarks are not concrete and realistic. I have never seen any language but (unsurprisingly) Julia compare its performance with the performance of other languages by calling those languages from within the host language. Definitely 15000 times faster than MATLAB as claimed by JuliaComputing can be achieved with such seemingly concrete and realistic benchmarks. LLVM has benefited and learned so much from Fortran and its compilers. But free food and service are always undervalued.


> Definitely 15000 times faster than MATLAB as claimed by JuliaComputing can be achieved with such seemingly concrete and realistic benchmarks.

I believe you're talking about NASA Launch Services engineers claiming Julia's ModelingToolkit simulations outperformed Simulink by 15,000x? That claim was of course not made by Julia Computing or anyone affiliated with Julia Computing, which is pretty clear because the person who makes the claim very clearly describes his affiliation at the beginning of the video. The source is here: https://www.youtube.com/watch?v=tQpqsmwlfY0, at 12:55. You did watch the whole video to understand the application and the caveats, etc., instead of just reading the headline and immediately coming to a conclusion, right?


The benchmarks on Julia's website are ones that take around 1 second. The couple nanoseconds of call overhead doesn't matter.


It is not just about call overhead. It is about a whole suite of aggressive optimizations only possible for a whole program. Point to one person or entity in the world who calls the SUNDIALS Julia wrapper to bind their C production code to SUNDIALS. If you cannot, you have two options: 1. make your Julia benchmarks concrete and realistic, or 2. cease and desist from pointless advocacy of your employer (JuliaComputing) and its benchmarks in public forums.


Sure you can keep moving goal posts. Of course it doesn't make sense to bind a C production code to a C package (SUNDIALS) through Julia. But if you're asking who is using Julia bindings to SUNDIALS as part of a real case, one case that comes to mind is the Sienna power systems dynamics stuff out of NREL (https://www.nrel.gov/analysis/sienna.html). If you look inside of the dynamics part of Sienna you can clearly see IDA being used (https://github.com/NREL-Sienna/PowerSimulationsDynamics.jl). IIRC at a recent Julia meetup in the Benelux region kite model simulations also used it for the same reasons (https://github.com/aenarete/KiteSimulators.jl) which of course is pointing to the open source code organization for Aenarete (http://aenarete.eu/).

The way to find other use cases is to look through the citations. Generally there will be a pattern to it. For cases which reduce to (mass matrix) ODEs FBDF generally (but not always) outperforms CVODE's BDF these days, so those cases have mostly converted over to using the pure Julia solvers. This includes not just ODEs but also other DAEs which are defined through ModelingToolkit, as the index reduction process generates ODEs and generally the ODE form ends up more efficient than using the original DAE form (though not always of course). It's in the fully implicit DAE form that the documentation (as of May 1st 2023, starting somewhere back in 2017 according to the historical docs) recommends using Sundials' IDA as the most efficient method for that case (https://docs.sciml.ai/DiffEqDocs/stable/solvers/dae_solve/) (yes, the docs recommend non-Julia solvers when appropriate. There's more than a few of such recommendations in the documentation). Power systems is such a case with Index-1 DAEs written in the fully implicit form which are difficult in many instances to write in mass matrix form and not already written in ModelingToolkit, hence its use of IDA here. By the same reasoning you can also search around in the citations for other use cases of IDA.
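
For anyone who wants to see what that solver switch looks like in practice, here's a minimal sketch using the classic ROBER stiff ODE (the problem choice is arbitrary; the point is only that swapping between the pure-Julia and Sundials methods is a one-argument change):

  using OrdinaryDiffEq, Sundials

  # Classic stiff ROBER reaction system as an in-place ODE.
  function rober!(du, u, p, t)
      y1, y2, y3 = u
      du[1] = -0.04y1 + 1.0e4 * y2 * y3
      du[2] =  0.04y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2^2
      du[3] =  3.0e7 * y2^2
      nothing
  end

  prob = ODEProblem(rober!, [1.0, 0.0, 0.0], (0.0, 1.0e5))

  sol_fbdf  = solve(prob, FBDF())       # pure-Julia BDF
  sol_cvode = solve(prob, CVODE_BDF())  # Sundials' CVODE via the wrapper
  # For a fully implicit DAE you'd build a DAEProblem and call IDA() instead.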


Our discussion will continue as long as one side believes fair, realistic benchmarks are merely moving the goalposts. Your benchmark has a severe fundamental flaw, especially given the tiny reported runtimes. I hope you realize and fix it before other critics (perhaps more credible than an unknown forum contributor) begin to question your programming knowledge or fairness. To address the matter, you must compile/write whole programs in each of the respective languages to enable full compiler/interpreter optimizations. If you use special routines (BLAS/LAPACK, ...), use them everywhere as the respective community does. Apples to Apples.


> before other critics (...) begin to question your programming knowledge

You must be a truly special kind of stupid to be questioning Chris' knowledge.


Typical Julia programmer/employee response when their flaws are pointed out. Arrogant, rude, young and inexperienced, irrational.


What about the other benchmarks on the same site? https://docs.sciml.ai/SciMLBenchmarksOutput/stable/Bio/BCR/ BCR takes about a hundred seconds and is pretty indicative of systems biological models, coming from 1122 ODEs with 24388 terms that describe a stiff chemical reaction network modeling the BCR signaling network from Barua et al. Or the discrete diffusion models https://docs.sciml.ai/SciMLBenchmarksOutput/stable/Jumps/Dif... which are the justification behind the claims in https://www.biorxiv.org/content/10.1101/2022.07.30.502135v1 that the O(1) scaling methods scale better than O(log n) scaling for large enough models? There's lots of benchmarks on that site which show things from small to large. And small models do matter too...

> If you use special routines (BLAS/LAPACK, ...), use them everywhere as the respective community does.

It tests with and without BLAS/LAPACK (which isn't always helpful, which of course you'd see from the benchmarks if you read them). One of the key differences of course though is that there are some pure Julia tools like https://github.com/JuliaLinearAlgebra/RecursiveFactorization... which outperform the respective OpenBLAS/MKL equivalent in many scenarios, and that's one noted factor for the performance boost (and it is not trivial to wrap into the interface of the other solvers, so it's not done). There are other benchmarks showing that it's not apples to apples and is instead conservative in many cases, for example https://github.com/SciML/SciPyDiffEq.jl#measuring-overhead showing the SciPyDiffEq handling with the Julia JIT optimizations gives a lower overhead than direct SciPy+Numba, so we use the lower overhead numbers in https://docs.sciml.ai/SciMLBenchmarksOutput/stable/MultiLang....

> you must compile/write whole programs in each of the respective languages to enable full compiler/interpreter optimizations

You do realize that a .so has lower overhead to call from a JIT-compiled language than from a statically compiled language like C, because you can optimize away some of the bindings at runtime, right? https://github.com/dyu/ffi-overhead is a measurement of that, and you see LuaJIT and Julia come out faster than C and Fortran there. This shouldn't be surprising, because it's pretty clear how that works?
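
For reference, a minimal sketch of what such a call looks like from Julia (this assumes a Unix-like system where libc is already loaded into the process; the commented-out library-tuple form is the general pattern, with placeholder names):

  # Foreign call with no wrapper layer: the JIT specializes this call site,
  # so the per-call overhead is essentially the native call itself.
  len = ccall(:strlen, Csize_t, (Cstring,), "hello")

  # General form against an arbitrary shared library (placeholder names):
  # ccall((:my_function, "libmylib"), Cdouble, (Cdouble,), 1.0)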

I mean yes, someone can always ask for more benchmarks, but now we have a site that's auto updating tons and tons of ODE benchmarks with ODE systems ranging from size 2 to the thousands, with as many things as we can wrap in as many scenarios as we can wrap. And we don't even "win" all of our benchmarks because unlike for you, these benchmarks aren't for winning but for tracking development (somehow for Hacker News folks they ignore the utility part and go straight to language wars...).

If you have a concrete change you think can improve the benchmarks, then please share it at https://github.com/SciML/SciMLBenchmarks.jl. We'll be happy to make and maintain another.


Precompile costs are one-time. If you want to deploy, you build a sysimage and ship that. Startup time is then about 0.5 seconds.
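
Roughly, with PackageCompiler.jl (the package names and file paths here are placeholders):

  using PackageCompiler

  # Bake your app's dependencies into a custom system image.
  create_sysimage(["CSV", "DataFrames"];
                  sysimage_path = "app_sysimage.so",
                  precompile_execution_file = "precompile_workload.jl")

Then you launch with julia --sysimage app_sysimage.so app.jl, and everything baked into the image skips recompilation.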


They are one time per instance, which isn't the same thing as one time. Julia sysimages are huge and take a long time to generate even on decent hardware. Last I checked, that whole process was very janky, poorly documented, and under heavy revision (as it had been for years prior).


It's not. Once you generate it somewhere, you can just copy the files to anywhere that has the same architecture.

Sysimages are huge (but they've gotten a decent bit smaller recently). Notably, 1.8 added some features that let you make them a bunch smaller for deployment. You can now remove the metadata (i.e. source code text), which saves about 20%, and you can also generate the sysimage from a Julia launched with -g0 to remove debug info (Julia, unlike C, includes debug info by default because stack traces are nice). We also recently fixed a really dumb bug that was causing libraries to be duplicated in sysimages, so that will sometimes save a few dozen MB. (Who knew that tar duplicates symlinks?)

When did you last check? It's now pretty dejankified and has been for about a year. The docs aren't perfect, but I think they're relatively good.


People have been saying "it's better" for 3 years now. One thing's for sure, I'm not going to go recheck this again anytime soon. Anything I say at this point will be met with "it's better now", and then I'll have to go find out it really isn't, in classic Julia fashion. The docs for PackageCompiler were awful for about 2 years. See my comment above and the other users chiming in about how annoying it is.

Fun Julia story. I remember one time someone, I believe from Julia Computing (IIRC), was telling me how much better Julia had gotten at something. They sent me links to academic flag-planting repositories that had no code in them. Literally empty packages with no branches, nothing but statement-of-purpose READMEs. I offered to work on it and was met with... academic competition about how I shouldn't do that because a package already existed for it, and how I should try to work with the author on theirs. Meanwhile I already had code for it, it just never went into the ecosystem. I'm highly unlikely to start investigating Julia again in the short term. Maybe in five years.


The reason people have been saying it's getting better for 3 years is because it has been. PackageCompiler 1.0, released in 2020, made it possible to distribute Julia programs as self-contained apps. Julia 1.6, released in 2021, added parallel precompilation, which made loading a lot faster. Julia 1.8, released in 2022, improved precompilation a bunch, and Julia 1.9, which will be released in 1 to 2 weeks, makes precompilation cache native code, which significantly improves things again.

Deployment is a fundamentally hard problem for dynamically typed languages. Shipping a Julia .so will probably never be as easy as shipping a .jar file in Java. However, Julia has gotten a lot more deployable over the past 3 or so years and work on that front continues. Julia 1.10 already has a bunch of compiler speedups that make things a bunch faster than 1.9 (I expect 1.10 to ship late 2023 or early 2024).


Your parent comment wrote "it's better"; you said "it's getting better". This is a common motte-and-bailey argument in Julia discussions:

The "it's better [now]" is most often given as a response to someone expressing a problem they've had, and in context it's presented in a way that suggests the problem is fixed.

"It's getting better" is a far more reasonable response, if it also comes with a caveat about how much better it's gotten and how usable for purpose it is. A lot of the time Julians seem to conflate between "it's a reliable usable feature" and "a pull request vaguely related has been merged and will be available some time in the future, which fixes maybe 10% of the issue".


Apologies if I caused confusion. (Personally, I'm not sure where "pretty dejankified" falls on the getting-better-to-better spectrum.)


If you reread this comment thread through the lens of "hm, how is the Julia community?", I think you'd find it very enlightening. Maybe you'd even find things you could improve. If only that were the goal.


What edge does Fortran really have over GCC C with GNU extensions? If you can use the restrict keyword, are there any speed differences left?


You're escaping the "mediocre programmer" level then.


Neither Julia nor MATLAB is a great production language. They are both fine if you just have some math you want to run to get some results. But... Julia changes and breaks core functionality regularly, in pretty much everything from the language itself to the packages you use. MATLAB, although more stable, has other detriments: it's closed source, it's not really designed to be a systems language, sometimes it's not fast enough, etc.

In my mind, Fortran will always be hard to replace, especially where there is sizeable legacy code. A lot of people don't realize it, but Fortran is kind of like SQL. It's old, backed by a lot of theory, and has delivered on its promises for years. That makes challengers' jobs really, really hard.


At this point, Julia is being used on ASML's lithography machines. It's pretty deployable.

I'm not sure where you are getting the idea of frequent breaking releases. The language has been pretty stable since 1.0 in 2018 (as in, almost all code that worked on 1.0 still works now). (There are some very minor breaking changes in minor releases, but Julia breaks a lot fewer things per release than Python or C.)


Cool, I am glad there are more people using it in production these days; it will help the language become more stable. I've deployed Julia to production a few times now. In almost every case it was rewritten in another language within a year, for one reason or another... This is sad, but, for being a v1 language, it never lost its "early adopter" experience for myself or my colleagues.

I've seen minor releases of Julia break essential packages. And it wasn't just one time, either. So where did I get the idea? Personal hands-on experience.

Again, it's great if you have a script and want to run it; anything beyond that, in my experience, results in a lot of turmoil and erosion. I almost wonder if it's a flaw in the language itself, or in the maintenance model of the packages. Oh well, not my problem.


> I've seen minor releases in Julia break essential packages.

Any specific examples of this happening in Julia 1.0 or later?


Absolutely, a minor change to Julia Base broke the CSV package for weeks. But that's like 1 of hundreds of examples. I don't think non-package developers realize how much effort package maintainers and drive-by contributors put in to keep that language alive.


Would you happen to remember the specific version, or have links to any of the other examples? I ask because I've repeatedly heard that every release is tested against the entirety of the (public) package ecosystem, so I'm curious how these snuck through - whether it was before such a system was put in place (maybe because of these breakages), or whether things still sneak through the tests to such a degree.


I don't remember, maybe check closed issues in their repository. But I think we all know why you're really saying this. To cast doubt on what I'm saying. That's okay, gotta get that $$$ or something.


MATLAB is basically a REPL over a collection of old-school Fortran libraries. So it is far from an alternative to Fortran.

See the HOPL history for the background of MATLAB: https://dl.acm.org/doi/10.1145/3386331

EDIT: fixed paper link


> Does Fortran still have any real advantage over Julia, MATLAB, etc.?

These are incomparable since Fortran is a standardized programming language with multiple implementations, whereas the other two you mentioned are products having single implementations.


The new revisions of Fortran are actually quite nice to work with. I spent a month using Fortran at a company I worked for, reworking some Fortran 70 code into a more recent version (Fortran 2000?). It overall felt more high-level than plain C. Eventually I also ported the code to Python and Cython, and while the Cython implementation actually was more performant (arrays were always allocated with fixed sizes in the Fortran, and in Python it was easy to pick just the right sizes), the Fortran implementation was fairly readable.


I really, really hate the way this author writes. It's so needlessly verbose, obtuse, and condescending.

I've written Node for a living. Mostly Typescript in recent years. I've encountered multiple codebases where previous developers have used all kinds of novel constructs to make Javascript codebases resemble a purely functional language. I've never seen an example of this where the developer has actually managed to make their codebase more concise, understandable, testable, extensible, or more robust. The usual outcome is a complete bird's nest of spaghetti code that only the original developer could ever understand. These codebases usually never outlive the tenure of the original developer: They're usually thrown out the second another dev even lays eyes on them.

Reading through this article, I don't really see anything that would make my real-world coding job easier. I don't see any constructs that would actually make my code less complicated. Not to mention the elephant in the room that adding thousands of lines of scaffolding code (that only the author understands) so that Javascript supports monads (that the developers asked to maintain the code won't understand) adds so much more surface area for bugs. If you want to write an application in Haskell, just do that instead. At least then the company knows to look for a Haskell developer to maintain the mess you've made.


I completely understand where you are coming from and don't doubt that you've seen some gross things built in the name of "functional Node". I do think that functional JS can be elegant when applied with restraint.

I've really enjoyed creating functional pipelines with Ramda in the past for professional projects. I liked how I could use the Ramda functions to explicitly state in my code what the flow of data was with functions like pipe and converge. It seemed to me that being able to understand the dataflow was easier with this paradigm. I could even create pipelines that would automate away dealing with promises in my pipelines with pipeWith. I would implement the same bits of code in "vanilla" js and with Ramda and Ramda was more concise and easier to read (if you understood how Ramda worked...).

You can see an example of the style that I like here: https://github.com/chughes87/calendarbot. I definitely was more "clever" in parts of that codebase than I let myself be in a professional setting heh.

I successfully onboarded a different team onto one of my projects when I was being switched to a different product at my company. An engineer who later did some maintenance work on it told me that the codebase was simple and easy to work with. I did get complaints about a later project that I implemented with Ramda from a person who was totally uninitiated and didn't bother to ask me for help.

I made a lil presentation about Ramda that explains some of this with some graphics that I think are helpful: https://docs.google.com/presentation/d/1tmre_8qJP-QhakXbiBpZ...


I got the impression that the reason the author chose JavaScript was not necessarily with the intention that people would go full-bore FP with it, but more as a way of "meeting people where they are"; JavaScript is widespread and many people are already familiar with it, so it's more likely that they can get up and running with the examples and start exploring the concepts more quickly than with, say, "Learn You A Haskell For Great Good" (where, at the very least, a lot of people would need to download and install the compiler).

And, if the author is successful, then the chances that you could one day be able to write JavaScript code in this fashion and expect it to be maintainable would increase!


This is exactly my impression as well and my personal experience. I first felt like I really understood functional programming when I started using Ramda. It was a library that provided me functional tools in a language I was already familiar with and that allowed me to play around freely without friction.


> I've never seen an example of this where the developer has actually managed to make their codebase more concise, understandable, testable, extensible, or more robust.

This is almost purely functional but makes no attempt to look like anything other than JavaScript and just about everything is identifiable to profilers and covered by some form of test automation.

https://github.com/prettydiff/share-file-systems


The first bit of code I come across in this codebase isn't functional: https://github.com/prettydiff/share-file-systems/blob/master.... You're modifying state. You're implementing iteration with do..while loops. That doesn't scream functional to me. Iteration is done with things like Array.map, Array.filter, and Array.reduce in functional JS. Higher-order functions are key. Functional code is declarative; do..while is imperative.


It is functional and imperative. Functional does not mean only declarative. There are even functional programming languages that don’t allow declarative style like Rebol and Red.

Likewise declarative does not necessarily mean functional. The examples you mention are declarative but not explicitly functional.

I get the impression people come to these statements because they read something about functions somewhere once. When you get past the vanity and actually read the code it’s just a bunch of functions and no vanity.


Where did you get that from the article?

From their results section:

> "Interestingly, detergent residue from professional dishwashers demonstrated the remnant of a significant amount of cytotoxic and epithelial barrier–damaging rinse aid remaining on washed and ready-to-use dishware."


Yes, the article (and your parent comment) accepts that this is true for professional dishwashers (not household ones):

> A professional dishwasher completes 1 or 2 wash and rinse cycles using 3.5 L of water per cycle. The detergent and rinse aid are automatically dispensed into the water at a concentration of 1.5 to 4 mL/L and 0.1 to 0.5 mL/L, respectively. At these concentrations, the residual dilution factor after rinse ranges from 1:250 to 1:667 for detergents and 1:2,000 to 1:10,000 for rinse aids.

> Household dishwasher detergents in a normal cup and plate washing program typically consume a total of 12 L of water: 4.8 L during the washing cycle, 3.6 L of water for the intermediate rinse cycle, and 3.6 L of water for the final rinse cycle. Between the washing and rinsing cycles, 200 mL of water remains inside the dishwasher. Accordingly, the dilution factor for one 20-g tablet of detergent is 1:80,000 (w/v).


The detergent residue they are talking about there isn't the detergent from washing, it is the rinse aid. Rinse aids are detergents that cut surface tension to let the water bead off faster.

They found nothing wrong with washing with detergent, just using rinse aid to dry in a commercial setting due to the detergent residue the rinse aid could leave.


> ...permanently blocks Samsung Pay, Secure Folder, a few other Samsung security apps...

Yet another good reason to root your phone. You wouldn't catch me using any of this Samsung shovelware.


From what I could tell apparently a lot of banking apps will also stop working. I never bothered to confirm it


That's a very fun example. However I'm a native English speaker, and I can't imagine actually writing a sentence like that. It's technically grammatically correct, but no one speaks like this. You would say "She must have been watched", alternatively "She had been watched".


Mm, I think it could come up quite naturally as "she must have been being watched" in something like:

  Two detectives are watching security film.

    DETECTIVE 1
  And then from 9:07 to 9:14 she started acting very cautiously, but –

  Detective 1 rewinds the tape by a few seconds and gestures at the screen.

    DETECTIVE 1
  – from this angle you can tell that she's making an effort to not seem suspicious. I wonder why?

    DETECTIVE 2
  She must've been being watched.

I do agree that form "must have had been being —ed" (with "had" in there as well) is an especially rare form, but in context, native English speakers understand what it means!

I wonder how many other constructions there are that native speakers don't really produce but that they still consider grammatical, or don't even find unusual.


I have read the “must have had been being” over and over again and as a native English speaker, I still can’t understand what it means. I hesitate to call it ungrammatical, but instead throw down a challenge: can you actually use it?


Sure! Using "had been" rather than "(has) been" indicates that something may no longer be the case, e.g. "she's been well (and still is)" versus "she had been well (but is not necessarily still well)". So in my example

  She must've been being watched.
suggests that, at that point in the security footage, she might still be being watched. If the detective had said

  She must have had been being watched.
it would additionally imply that, at that point in the security footage, she was no longer being watched (but had been). So let me rework the story a bit:

  Two detectives are watching security film.

    DETECTIVE 1
  She's acting ordinarily, but I see fatigue written on her face. I wonder why.

  Detective 1 leans back in their chair and muses.

    DETECTIVE 2
  She must've had been being watched. The state security service monitors high-profile civilians occasionally, and a few days before this she was acting very cautiously, almost as if she was suspicious of something.

(Also, I think I falsely portrayed the universality of such a construction. Although I think some native English speakers would accept it as grammatical, many might not!)


No, this is just wrong. You are conflating two tenses. Must can be present (be) or perfect (have).

In the example above, you would say: “She must have been being watched.”


To my ears, there's a difference!

  I am watched.            (present passive)
  I was watched.           (past passive)
  I have been watched.     (present perfect passive)
  I had been watched.      (past perfect passive)

  I am being watched.      (present progressive passive)
  I was being watched.     (past progressive passive)
  I've been being watched. (present perfect progressive passive)
  I'd been being watched.  (past perfect progressive passive)

  I must be watched.
  I must have been watched.
  I must have been watched.
  I must have had been watched.

  I must be being watched.
  I must've been being watched.
  I must've been being watched.
  I must've had been being watched.

That, I think, is roughly it? Although now I'm beginning to doubt myself, and maybe it is just:

  I must be watched.
  I must have been watched.
  I must have been watched.
  I must have been watched.

  I must be being watched.
  I must've been being watched.
  I must've been being watched.
  I must've been being watched.
or really, maybe it's better to say those forms don't exist:

  I must be watched.
  I must have been watched.
  —
  —

  I must be being watched.
  I must've been being watched.
  —
  —

What's neat is that I can assign meaning to "must've 'd been", but it really teeters on the edge, sometimes sounding strange but acceptable and sometimes sounding simply wrong.

I can't find many examples of it, but there are a few I've found online:

"To qualify for a special enrollment due to a permanent move, you must have had been enrolled in other minimum essential coverage, such as under a job-based health plan, another Marketplace plan, or Medicaid."

"Sales must have had been in the same year as the tax return."

So perhaps it's best to say it's nonstandard but attested.


Oh my. You have got yourself in quite a pickle. “Must have had been” is simply wrong; in your examples, it should always be “must have been”.


> "Must have had been" is simply wrong; in your examples, it should always be "must have been".

As a matter of prescription, maybe, but as a matter of description? I think it's interesting. It is meaningful and it's something that English speakers produce. Here are other examples in the wild:

"To complete the request, the owner of the permit must have had been present at the time the citation was written, present a valid permit to the Tax Collector in person, have a copy of their citation, and pay a $7.50 processing fee to the Tax Collector if approved."

"Lankeshwar must have had been easy to defeat compared to new-age Ravanas."

It can even have a distinct meaning (rather than just being a variation), which is how I would interpret it. Which I think is cool.

It's perfectly fine to say "don't use this construction if you want to be taken seriously by so-and-so", but if its usage happens to be dialectal, for example, I don't think you can say those speakers are using their dialect of English incorrectly.


I think that the human mind is not evolved for, and does not cope well in, societies at the scale we now live in. I think there's a certain tipping point where social cohesion begins to break down, and people's psychology begins to shift from participation in a society of their peers to guarding themselves from a society of potentially dangerous strangers. I think this phenomenon has been extrapolated to a large scale. I also think that the society consisting of people who are largely genetically different from yourself has an impact as well on social cohesion, and by extension, on how people behave in society.


>I also think that the society consisting of people who are largely genetically different from yourself has an impact as well on social cohesion, and by extension, on how people behave in society.

Rome existed for longer than the United States has existed without tying the legal definition of a Roman citizen to someone born in Rome (e.g. Paul of Tarsus, aka Saint Paul, was a Roman citizen and a Hebrew by birth from Anatolia), never mind whether being born in Rome actually meant your parents were themselves from Rome too.

Phenotypical resemblance also never stopped any of the numerous and vicious civil wars and rebellions in Chinese history, just as one example, nor did it stop the military aristocracy of Middle Ages Europe from constantly robbing, pillaging, and raping the peasantry and the clergy, so much that the Church had to issue numerous declarations and edicts to rein in the behavior of knights.


I'm not sure what it's called, but I've seen a product which is a database of the time/location of US car license plate sightings. As I understand it, these are OCR'd from a combination of private and public footage. I wonder if something similar exists for faces, and if some company is performing facial recognition on publicly uploaded footage. It sounds quite paranoid; however, we know for a fact that such technology exists, and that there's a motivation for it.


I don't _think_ so yet, publicly -- as far as time/location sighting records go. I would assume that national security police forces have it, though... perhaps still secretly in the US? It is known that Chinese security police have it.

But facial recognition on public data, yes, there are commercial facial recognition databases, but I don't think they (yet?) have timestamped, geocoded sightings.

> Australia and U.S.-based face biometrics provider VerifyFaces has unveiled its consumer-facing facial recognition service which can be used for background checks. Unlike image-only searches such as PimEyes, VerifyFaces combines facial recognition and text searches.

> From $11, individual users can conduct a search on the company’s website in four ways: by photo or video, name and birthday, phone number, and home address.

https://www.biometricupdate.com/202212/verifyfaces-unveils-f...


Here is a Vice article [1] on how the repo industry leverages a private database from ALPR [2] cameras mounted on cars, businesses, etc. It tracks everyone, not just those delinquent on their payments.

[1] https://www.vice.com/en/article/ne879z/i-tracked-someone-wit... [2] https://drndata.com/repossession/


All of the tow companies have cameras on their trucks so that they can sell this data.


My god, where does the data mining end?


Omniscient beings don't need data mining.


Privacy legislation similar to the GDPR. Getting the definition of 'consent' right is critical. Until then, there is no end.


Not just consent, but also making organizations liable for the data that they do collect with your consent.



Clearview AI is what you're looking for.


I did some obviously poor googling and couldn’t find this? It’s a private product, not an open source DB?


You can make money on this if you have a high traffic area to place the cameras! https://drndata.com/repossession/


I wouldn't be surprised if this already existed and was being quietly sold to law enforcement agencies.


I've been a long time i3wm user, and I couldn't be happier with my workflow. What network managers are people using together with i3? I've been using wicd for a while, however with Debian deprecating Python2 this won't be viable for long. I haven't found another lightweight GUI network manager which works well with i3. This might not be the perfect place to ask, however I don't use Reddit.


Network manager, with nmtui and nm-applet.


Same here. NM worked flawlessly on my Sway setup, which is not i3, but Sway is very similar to i3 anyways.


nm-applet also, works fine


I use iwd and systemd-networkd. I don't use any applet or GUI control. With my setup I just end up ignoring network configuration the majority of the time.

Edit: I've found iwd to be a huge improvement over past wireless networking solutions.


Maybe iwctl would work for you?


Network-manager is the easiest transition


This idea is pure fantasy. This habit of classifying people as NIMBYs if they disagree with you on this topic is incredibly toxic. There are a plethora of totally valid reasons why people would be opposed to endless urban sprawl, consolidation, and population growth.


> plethora of totally valid reasons why people would be opposed to endless urban sprawl, consolidation, and population growth

Totally agree. But the counterpart of that is rising housing costs. If you accept that tradeoff, you aren't a NIMBY. This covers many homeowners. But renters opposed to development while complaining about housing costs are trying to suppress their housing costs by increasing others'. That externalization is textbook NIMBY.


Thank you for taking the time to make a nice reply, and for not taking my criticism of your post personally. I really appreciate that.

I feel like you have a very arbitrary definition of NIMBY. I'm writing this post right now from the 14th floor of a highrise building in Sydney's inner city, in the apartment I rent from a landlord who lives in mainland China. There are still rows of historic terrace houses in the nearby suburbs that have been heritage listed. I'm sure property developers would love to turn these into more highrise apartments. Even though I'm renting, I don't want these to be replaced with endless new apartments. I could list a dozen reasons too. For one, I don't think this would help create a city that people would actually enjoy living in.

I've been told by people who would know that one of Sydney's big problems is that property developers are able to artificially inflate property value by staggering the release of newly developed property onto the market. So more development isn't necessarily going to solve any of this country's problems with property value. It hasn't so far.


> There are still rows of historic terrace houses in the nearby suburbs that have been heritage listed. I'm sure property developers would love to turn these into more highrise apartments. Even though I'm renting, I don't want these to be replaced with endless new apartments. I could list a dozen reasons too.

Funny, I would have the opposite opinion. Just because someone was alive and rich back in 1950 or whenever these cute little houses were built, doesn't give them more of a right to live in that area than others, in my opinion.

Let's face it, most of these cute little historic houses are probably owned by the same mega-rich investors and CCP party members as your apartment building.

1 person having a garden does not justify 15 families not being able to live there.


> a very arbitrary definition of NIMBY

NIMBY means not in my backyard. As in, I want the benefits of a thing but not its cost. Wanting affordable housing while denying development is NIMBY. Saying no development because you want higher property prices is not NIMBY, it's prohibitionism. (Depending on the environment, it could be reasonable and/or heartless.)

> been told by people who would know that one of Sydney's big problems is that property developers are able to artificially inflate property value by staggering the release of newly developed property onto the market

This is prudent pipeline management. Why would you bid up the cost of materials and labor only to dump the finished product at a loss?

On supply and demand: American house prices are elastic, but over long timelines [1]. In Sydney, dense housing is more elastic than detached housing [2]. The abundance of those historic terrace houses, together with long development approval times, cause the high prices and relative price inelasticity.

[1] https://www.sciencedirect.com/science/article/abs/pii/S00941...

[2] https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-8462....


> This is prudent pipeline management...

Doesn't this contradict your central thesis? If the supply of housing doesn't exceed the demand, the price isn't going to drop. How can urban consolidation actually benefit the renter class if property developers are able to artificially increase the value by restricting supply until mechanisms like immigration cause demand to catch up?


> Doesn't this contradict your central thesis?

No. In a development-constrained world, particularly one with long approval timelines, you need to make money on margin. In a less-constrained world, you can bring economies of scale to bear and make money on volume.

The thesis is: if you have an anti-development environment, developers will maximise margins. This isn’t a conspiracy and it isn’t artificially increasing value. It’s survival. If ten houses will get built but there is demand for twenty, and everyone pays the same for labour and materials and lobbyists, all those houses will be as high end as the market will bear. You’re competing in getting the right to build; the market is inelastic. If anyone can build twenty or thirty houses without years of approvals, you’re going to prioritise your costs, because there is a chance you don’t sell every single house. You’re competing on price and value; the market is elastic. (You also get a learning curve.)

This is why Sydney has price inelasticity for detached housing. The scarcity is a policy choice.


This sounds intuitively correct with regards to the economics of property development. I'm not a property developer though. I'm not really that concerned about their profits. I'm a young person renting an apartment in a city that is rapidly becoming unaffordable for the average Australian. The point of my post above was: If property developers are legally able to artificially constrain supply to maximise their profits, then how does all this YIMBYism actually benefit me? The main argument I hear for urban consolidation is that increasing supply lowers the cost. If this doesn't actually happen, then what's in it for us again?


> There are a plethora of totally valid reasons why people would be opposed to endless urban sprawl, consolidation, and population growth.

There really aren't, though, in the sense that the costs (both societal and individual) drastically outweigh the benefits. Of course prohibitions on new construction are narrowly beneficial to specific individuals. Who wants some guys starting up new construction at 7AM next door? If you like your quiet little block, then why on Earth would you want it to densify? Somebody else's construction project is little more than an annoyance, after all. If you can ban it, then great!

The problem is that these individual preferences come with enormous costs, both economic and with respect to individual freedom. When weighed against the downsides, those banal individual preferences about densification are no longer compelling.

In short, yes, people do have rational, coherent reasons to oppose growth, but, no, those complaints are not in the end valid.


Everything in your post is just your own personal opinion.

> The problem is that these individual preferences come with enormous costs, both economic and with respect to individual freedom...

What if I don't agree with increasing the population? If I don't want to increase the number of people in the city I'm living in, then why on earth would I want urban consolidation? Does anyone actually enjoy living in a tiny apartment, as opposed to being able to afford a house with a yard? The need for endless population increase is not just some foregone conclusion. Not everyone here is an SWE living in SF, with SF problems, and SF opinions.

> In short, yes, people do have rational, coherent reasons to oppose growth, but, no, those complaints are not in the end valid.

I don't agree with your opinion. Should I just classify all of it as 'invalid'?


> What if I don't agree with increasing the population? If I don't want to increase the number of people in the city I'm living in, then why on earth would I want urban consolidation?

I’m not sure what kind of answer you’re looking for with those “what if” questions. What if you preferred that the human race go extinct? I suppose the answer to all of these “what if” questions is simply that other people will disagree with you and oppose you in various ways.


> Does anyone actually enjoy living in a tiny apartment, as opposed to being able to afford a house with a yard?

In order to answer this question for yourself you need to first accept an iron law of economics: because decisions are not made in a vacuum, there is no such thing as an abstract preference, only constraints and tradeoffs.

My preference is that I have a 10,000 square foot single-family home located on an otherwise empty block of land just south of Central Park. That way I get everything great about single-family living and access to the economic and cultural superpower that is Manhattan.

But, and I mean this technically and in the kindest way possible: literally who gives a shit?

Everything in life is tradeoffs. The NIMBY position is that tradeoffs can be wished away by legislation. But they cannot. It only deranges the situation.


>What if I don't agree with increasing the population? If I don't want to increase the number of people in the city I'm living in, then why on earth would I want urban consolidation?

Society should ignore your whining.


How does someone oppose population growth?

That’s a literal question. What public policy do you endorse that has the ability to significantly reduce the number of people being born?

If you don’t have one you’re just opposed to building enough houses for the people we have.


> How does someone oppose population growth?

In the case of the Western world, what population growth? The rate at which first-world couples have children has slowed to a trickle in the last few decades. This seems to have spooked governments in the Western world into enacting policies to encourage immigration, ostensibly to prevent a decline in economic growth coinciding with a shrinking population. The only place where the birth rate is actually increasing is Africa.

My personal theory (which people are free to disagree with) is that the decline in birth rate in the first-world is a reaction to overpopulation. Although I acknowledge that extrapolation from this example is risky, we know that some animals lose their desire to breed when population density increases, and in captivity. Is it so strange that humans could be similar?


> What public policy do you endorse that has the ability to significantly reduce the number of people being born?

I'd endorse free contraception.

Options I'd oppose but others might favor include not requiring insurance to cover fertility treatments, prohibiting IVF entirely, mandatory birth control until marriage, sterilization as punishment for crime, and removing the child tax credit. Plus restricting immigration and increasing deportations which have a similar effect on the people/housing ratio.

(But I'm in favor of more people being born, and building lots more housing)


Increasing the level of education for women. Increasing the access to pre-natal care for women. Decreasing the poverty level generally and specifically for young families. Moving more people out of agricultural work.

Those things are highly correlated with lowered birth rates, though I suspect the last one is probably not applicable to the US.


> I suspect the last one is probably not applicable to the US

None of them are applicable to America [1].

[1] https://www.census.gov/content/dam/Census/library/publicatio...


The birth rate in the US is 1.64 children per woman. It’s already a significantly shrinking population, replacement rate is around 2.1 children per woman. The US only maintains its population through immigration.

How much lower do you want it to be?


The public policy you want to reduce population growth is simply development. There’s a strong negative correlation between HDI and birth rate. The US, Canada and virtually every other developed nation would lose about 20-40% of its population in a single generation were it not for immigration.


> There are a plethora of totally valid reasons why people would be opposed to endless urban sprawl

Density is literally the opposite of sprawl.


> My hypothesis has been that renters stopped voting.

Who are these renters who stopped voting? I can only speak for Australia, however the average age of renters has been rising for decades. Most of Gen Y have come to terms with the fact they will likely never own property in a major city. These people keep voting, but their vote isn't accomplishing anything.

> Density hurts landowners by decreasing property values and landowners vote.

How does density actually hurt landowners? If you own a house in a heavily consolidated urban area your property value will hardly have decreased. Look at the value of houses in Ultimo, Chippendale, Pyrmont, for instance. Also consider that maybe people just don't want to live in a highrise legoland, or think that endless urban consolidation is a good thing for our cities.


> How does density actually hurt landowners? If you own a house in a heavily consolidated urban area your property value will hardly have decreased.

Correct because, and I can only speak for America, those big cities stopped building. San Francisco alone is short about a quarter million homes, like 33% of the existing population. There’s a whole Wikipedia article on it. That’s why it’s so unaffordable.

> Also consider that maybe people just don't want to live in a highrise legoland, or think that endless urban consolidation is a good thing for our cities.

This is a pretty funny position to take. Those cities didn’t get to high rises because nobody wanted to live in them. You know those buildings are full … right? If nobody wanted to live there prices would be way lower and they never would have built denser housing.

Tokyo is a great example of a major metro where supply and demand are roughly equal and they haven’t seen houses increase in price since 1990. Their units cost a bit more than cost of construction. This is the power of federalizing zoning rules so city councils can’t get between you and building a house on your property.


> San Francisco alone is short about a quarter million homes

If they added a quarter million homes, do you think they would still be short another quarter million homes?

Not saying densification is bad. But in some cities the density can keep increasing and not have any drop in demand. Apartment prices in dense cities don’t drop?


> If they added a quarter million homes, do you think they would still be short another quarter million homes?

Not in this market, haha. The number was based on the quantity of houses added vs the quantity of jobs added in SF and the surrounding area. (From [1]: "For example, from 2012 to 2016, the San Francisco metropolitan area added 373,000 new jobs, but permitted only 58,000 new housing units") But over time? Maybe.

> Apartment prices in dense cities don’t drop?

There are very few cities where supply of housing is allowed to meet demand for housing - but as I mentioned, Japan is a great example. Japan hasn't seen an increase in the cost of housing since edit: [1995, not 1990 as I mistakenly claimed in GP post]. [2] Compare to the US. [3] What happens is that they stop going up in price.

[1] https://en.wikipedia.org/wiki/San_Francisco_housing_shortage

[2] https://fred.stlouisfed.org/series/JPNCPIHOUAINMEI

[3] https://fred.stlouisfed.org/series/CPIHOSNS


Yea, also adding more homes or more units will just increase rents and prices anyway, especially with higher interest rates.

At the end of the day a lot of people have this very wrong idea that living in the best weather and in the epicenter of a global city with top-tier jobs, food, etc. will ever be “affordable”. It simply will not be. Ever.


Tokyo has all of those things, and yet. Why do you think supply and demand do not exist in the housing market? Why is this the one market on earth where it won't ever work? If you have as many houses as jobs in the nearby area then it will be more affordable. That's just facts.

Further, higher interest rates lead to lower house prices, to an extent (but only to the extent supply exceeds demand), because people buy houses based on monthly mortgage affordability.

People have looked into this. Here are the results. [1]

> The resultant high demand for housing, combined with the lack of supply, (caused by severe restrictions on the building of new housing units) caused dramatic increases in rents and extremely high housing prices.

I'm not saying you'll be able to live in SF for the cost of living in a shack in the corn belt, but there's no reason it'll be "unaffordable forever and there's nothing we can do about it" when that historically wasn't the case and there are other global counter-examples. And a ton of research was done on this.

[1] https://en.wikipedia.org/wiki/San_Francisco_housing_shortage


> Why do you think supply and demand do not exist in the housing market?

Why would you think that I think this? That's really confusing.

I think the issue you have here is that you are thinking in extremely simplistic terms and not really accounting for real supply and demand in this particular market. Instead of looking to Tokyo (which is a generally bad example because Tokyo and Japan are not great generalized models), you should look to New York City to see what will happen if you continue to build housing in the Bay Area. It won't get cheaper; in fact, the more you build the more expensive it'll get, for a few reasons:

  Interest rates
  Housing standards (environment, earthquakes, etc.)
  Developers do not want to build low-margin housing so they'll only build "luxury" housing with cheap, but perceived higher-end finishes so they can charge more rent per sqft
  Lack of available workers

As developers build additional housing they'll only build for higher rental rates, but because there is a near infinite demand for housing in San Francisco and California as a whole, as additional units come onto the market they'll raise the median rent, but it won't make existing housing cheaper, it'll just be the new floor. If you could build a million units or something in a year, you may be able to reduce prices, but construction doesn't work that quickly. Demand far outstrips supply and will continue to do so.

Tokyo is incomparable for a few reasons, but you can start by examining Japanese birth rates, immigration policy, and California's comparable car-only infrastructure.

Historical examples aren't very useful here because historically there were far fewer people, travel from the highly populated East Coast to California was long and arduous and the benefits were "unknown", and people had a lot less money and stronger family ties. Trying to do global comparisons is generally suspect as well. But I guess if you want to see what things would look like, New York City and Hong Kong (pre commie China takeover), with their similarly extreme housing costs and density, are appropriate versions of the future of the Bay Area real estate market. With that being said, who knows what will happen with remote work and the like, but there probably won't be much in the way of price reductions there anyway over the longer term.

If you want less expensive housing you'll have to live somewhere else. That's just how the world works.


I mean, here are my responses to each of the four points you raised:

1) Interest rates make existing housing stock less expensive, because buyers can no longer afford a property at the same price given fixed monthly payments.

2) Housing standards are relatively consistent, and you do know Tokyo is on a fault line, right?

3) Developers will happily add any kind of unit they think they can sell; just look at what the city of SF is rejecting on a regular basis. But even if you only build luxury housing, that pushes existing wealth out of less fancy units, opening them up to lower wealth tiers. New supply is new supply. Period.

4) Lack of available workers factors into the cost of construction, but the cost of construction is absolutely dwarfed by market prices. If the two were anywhere close, point 4 would matter, but they're just not.

> Tokyo is incomparable for a few reasons, but you can start by examining Japanese birth rates, immigration policy, and California's comparable car-only infrastructure.

This constrains demand, but the point remains that in their market supply and demand meet. Supply and demand can meet by increasing supply or by decreasing demand. Btw, Japan’s immigration policies are actually fairly lax, but the perception of their culture as unwelcoming to outsiders creates little actual demand; US immigration policy, by contrast, is significantly more stringent.

Density and transit are a chicken and egg problem. More density however allows you to invest more in transit.

> If you want less expensive housing you'll have to live somewhere else.

This wasn’t the case historically, and there’s no reason to think it can’t be the case again in the future except for lack of zoning reform. Anyway, if you want to take up the supply/demand point, feel free to take it up with Wikipedia, which cites plenty of sources providing contrasting examples.

> That's just how the world works.

I disagree, but thanks for your perspective.

I look forward to your treatise on why supply and demand stops working when it’s nice outside ;) Especially since during COVID rents fell massively in SF as supply outstripped demand, even though the weather remained lovely.


> Interest rates make existing housing stock less expensive as people are unable to afford a property of the same price based on fixed monthly payments.

You are making overly simplistic assumptions. Interest rates are one factor that can push prices down to balance monthly payments, but you're forgetting that developers also have to borrow money to build new housing, condos, or apartment units, to renovate existing properties, and to run their businesses. Higher rates change their IRR: a project like a new 35-unit rental or condo building that pencils out at 0% or 1% rates may become unattractive at higher rates, so prices and rents have to rise to preserve the IRR. Don't make the mistake of assuming that higher interest rates simply lead to cheaper housing across the board. Even so, higher rates don't necessarily mean housing prices go "back" to some historic level, and you still have to account for individual markets. San Francisco is going to be different than Denver, or Miami, or Topeka.
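
To be concrete about the financing point, here is a deliberately crude back-of-envelope sketch (Python, made-up numbers, a hypothetical required_monthly_rent helper, interest-only debt; not how real underwriting works) of how the borrowing rate moves the rent a new project needs in order to pencil out:

    def required_monthly_rent(cost_per_unit, loan_to_cost, borrow_rate,
                              target_equity_yield, opex_per_month):
        """Toy break-even rent: monthly rent per unit that covers
        interest-only debt service, operating costs, and a target
        cash yield on the developer's equity."""
        debt = cost_per_unit * loan_to_cost
        equity = cost_per_unit - debt
        monthly_debt_service = debt * borrow_rate / 12            # interest-only
        monthly_equity_return = equity * target_equity_yield / 12
        return monthly_debt_service + monthly_equity_return + opex_per_month

    # Illustrative only: $500k/unit cost, 70% financed, 8% target equity
    # yield, $500/month operating costs per unit.
    for rate in (0.01, 0.06):
        rent = required_monthly_rent(500_000, 0.70, rate, 0.08, 500)
        print(f"borrowing at {rate:.0%}: break-even rent ~ ${rent:,.0f}/month")
    # roughly $1,800/month at 1% vs. $3,250/month at 6%

Real projects use full multi-year IRR models rather than a single break-even number, but the direction is the same: cheaper debt lets lower rents clear the hurdle, and more expensive debt pushes the required rent up.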

> Housing standards are relatively consistent,

Housing regulations, building rules, quality, size, etc.: are all these factors consistent across the US and Tokyo/Japan? If so, that's news to me. I don't recall Ohio, where I live, having the same environmental reviews (for example) as California, or the need to build to withstand earthquakes. Or are you saying that Tokyo and San Francisco have the same housing standards?

> Developers will happily add any kind of unit they think they can sell, just look at what the city of SF is rejecting on a regular basis. But even if you only build luxury housing, that pushes existing wealth out of less fancy units opening them up to lower wealth tiers.

While it's true that San Francisco in particular may be rejecting units, you are making a mistake by equating "developers will happily add any kind of unit they think they can sell" with "developers will happily add any kind of new unit they think they can sell at an acceptable IRR given interest rates and market conditions".

You are also mistaken in believing that adding "luxury" housing creates a drop in prices for other housing; that does not necessarily follow when there is pent-up demand at the cheaper price points. In markets like San Francisco or New York or other highly desirable areas, you just raise the median rent, you don't lower it.

> New supply is new supply. Period.

Overly simplistic, and you can show that this isn't the case with a simple thought experiment in which a developer builds nothing but $50,000/month condos or $5mm homes: supply is added, but it doesn't alleviate the housing shortage.

> Lack of available workers factors into cost of construction, but cost of construction is absolutely dwarfed by market prices. If we were close at all to it, we could include (4) but we're just not.

What do you mean "cost of construction is absolutely dwarfed by market prices"? Are you saying that it's cheaper to build new housing and sell it than it is to sell existing housing?

> This constrains demand, but the point remains that in their market supply and demand meet. Supply and demand can meet by increasing supply or by decreasing demand.

Sure, but you are oversimplifying things, and you can't generalize Tokyo to the US or San Francisco. Run the same thought experiment with Hong Kong, Singapore, or London.

> Btw, Japan’s immigration policies are actually very lax, but the perception of their culture as unwelcoming to outsiders creates little actual demand. The US immigration policies by contrast are significantly more stringent.

I'm not sure this is true anyway, but it doesn't matter if the effect is the same: fewer immigrants. It's difficult to compare Japan with anywhere else, really, so I'm not sure why you continue to do so.

> Density and transit are a chicken and egg problem. More density however allows you to invest more in transit.

Yes, but it is still an existing problem, which is why I mentioned it. It's hard to build new housing when you have mandatory parking minimums and the like (I believe, but could be wrong, that these were removed across California last year).

> This wasn’t the case historically and there’s no reason to think it can’t be the case in the future except for lack of zoning reform.

By that rationale, land in California should be free or extremely cheap, because it was in 1880.

> I look forward to your treatise on why supply and demand stops working when it’s nice outside ;)

So climate isn't a factor in housing prices? Why are you taking such extreme positions? "Supply and demand stops working": who said that? Certainly not me.

> Especially since during COVID rents fell massively in SF as supply outstripped demand - even thought the weather remained lovely.

I mentioned COVID already, but it primarily applies to office vacancies. San Francisco in particular is still very expensive to live in despite COVID, for obvious reasons: it took a global pandemic and an entire shutdown of the city to get rents to drop. Populations ebb and flow, markets go up, they go down, etc.

