
It's been my belief for over 20 years now that dedicated/instrumented roads for autonomous vehicles are the only way autonomous cars will ever be a thing at mass scale, short of the invention of true AGI (which I still don't think we're close to). I doubt such roads will become a thing within the next 6 years, though.


I think there might be a trial stretch of road somewhere in a few years, although surely not widespread. Such a thing feels inevitable to me, though, if we’re going to have self-driving cars at all.


Isn't a major feature of consensus algorithms that they're tolerant of failures? Even basic algorithms take error handling into account and shouldn't be taken out by a bit flip in any one component.
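
As a toy sketch of the kind of fault masking I mean (not any real avionics or consensus implementation): with three redundant channels, a single flipped value simply gets voted out.

    /* Toy sketch only: median voting across three redundant channels masks
     * a single corrupted value, e.g. one produced by a bit flip in one unit. */
    double vote_median3(double a, double b, double c)
    {
        if ((a >= b && a <= c) || (a <= b && a >= c)) return a;
        if ((b >= a && b <= c) || (b <= a && b >= c)) return b;
        return c;
    }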


Yes. To clarify, my understanding of _this_ particular incident was wrong because it was based on reading the report of a previous incident.

But for the 2008 incident whose report I read and linked, that was what happened. The ADIRU probably did get an SEU event, and that should have been mitigated by the design of the ELAC unit. The ELAC failed to mitigate it, so that's the part they probably fixed.


For some reason it took this long to hit me.

If you take as axioms:

1) Countries have major political interest in whether other countries have nuclear reactors

2) Countries are already, at large scale, manipulating discourse across the internet to achieve their political goals

Then of course it follows that any comment thread on a semi-popular or higher site about whether a country should build more nuclear reactors is going to be heavily manipulated by said countries. That's where (most) of the insane people in such threads are probably coming from.

How are we supposed to survive as a civilization with such corrupted channels of communication?


What, in your view, is the political interest?

There are countries that have an interest in having gas or oil bought from them. It is not clear whether they are for or against other countries going nuclear: on one hand, nuclear will replace part of their market; on the other hand, lobbying to move towards nuclear may impede progress in replacing gas and oil with renewables (a strategy would be to lobby so that the nuclear project starts, and then lobby so that the project stagnates and never delivers).

There are countries that have an interest in seeing nuclear adopted because they have a market for ore extraction or waste processing. There are countries that have an interest in seeing nuclear not adopted because they have a market around other forms of generation.

Finally, some countries may want to see their neighbors adopt nuclear: the neighbor will pay all the up-front bills and take all the risk (economic, but also PR, the cost of educating experts, ...), and if they succeed, they will provide very cheap imported energy that can fill the gaps the country did not want to invest in.

So it is not clear that there is just one stream of lobbying. The reality is probably that every "side" contains some manipulative discourse from foreign countries.


Does this also apply to fossil energy threads? Countries have a major political interest in whether other countries use fossil energy, to mitigate the climate catastrophe and ramp down fossil fuel use.


I really, really, wish somebody would actually put together a real reliability report. You know, by actually getting hard data on what repairs different models need, how often different models break down, how long different models last, etc. That's how you should rate reliability.

The Consumer Reports model of just surveying a random collection of people about what they personally think about the reliability of cars is not hard data. They don't collect any data themselves; they just take random people's beliefs as the data. It's also an ouroboros: what they rate as reliable/unreliable one year will influence people's beliefs when they're surveyed the next year about what they believe is reliable.


They did it in Germany: https://www.autoblog.com/news/the-bestselling-tesla-model-y-...

But it's based on German-made models.


AFAIK no, the German report just lists cars that passed or were flagged by TUV. TUV fails you for negligence, like not servicing the car every year like a good German VW owner.


It does not require that, though obviously a car which is never serviced is more likely to fail.


TUV inspection is all about checking whether the car is routinely maintained and in optimal working condition. It fails you for things like:

- rusted rotors; a Tesla owner won't ever notice anything wrong with the brakes

- worn-out suspension; Tesla owners are used to a harsh ride


All major communication forums on the internet have been mass manipulated/poisoned by countries across the world for well over a decade now. A huge chunk of all internet speech is inauthentic. In my mind, AI videos really don't degrade the situation much further. The internet as a communication medium has already been completely compromised for a long time.


Citation?



"[..] deploying a solar array with photovoltaic cells – something essentially equivalent to what I have on the roof of my house here in Ireland, just in space. It works, but it isn't somehow magically better than installing solar panels on the ground – you don't lose that much power through the atmosphere"

As an armchair layman, this claim intuitively doesn't feel very correct.

Of course AI is far from a trustworthy source, but just using it here to get a rough idea of what it thinks about the issue:

"Ground sites average only a few kWh/m²/day compared to ~32.7 kWh/m²/day of continuous, top-of-atmosphere sunlight." .. "continuous exposure (depending on orbit), no weather, and the ability to use high-efficiency cells — all make space solar far denser in delivered energy per m² of panel."


There were some big opsec fails on Reddit a decade ago in the run-up to the 2016 election, where lots of propaganda accounts were linked together via similarly generated usernames. The big guys already learned basic lessons like that all those years ago and don't make those simple mistakes anymore.


Surely any time their sales show good growth somewhere, you'd be sharing similar stories about that positive news, right? Surely.


Europe is the second biggest EV market (China is the biggest, North America third). Tesla's market share has seen a steady decline in Europe:

https://eu-evs.com/marketShare/ALL/Groups/Line/All-time-by-Q...

At a time when the overall EV market share is growing in Europe:

https://www.acea.auto/pc-registrations/new-car-registrations...


Are Tesla shares doing well in any major market across the world?


Well, the point is that the manager gets the praise/promotion/etc. for reducing costs and supposedly improving performance, and then they bounce and leave the company, moving on to the next place, before the long-term effects can be evaluated.


SDL 3.0 introduced its GPU API a year or so ago, which is an abstraction layer on top of Vulkan and other backends; you might want to check it out.

Although after writing an entire engine with it, I ended up wanting more control, more performance, and to not be limited by the lowest-common-denominator limits of the various backends, and eventually switched back to a Vulkan-based engine.

However, I took a lot of learnings from the SDL GPU code, such as their approach to synchronization, which was a pattern that solved a lot of problems for me in my Vulkan engine, and made things a lot easier/nicer to work with.
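
For anyone curious, this is roughly what minimal setup looks like with it, from memory; double-check the exact names against the current SDL_gpu.h, and error handling is omitted:

    /* Sketch of minimal SDL3 GPU setup (names recalled from memory; verify
     * against SDL_gpu.h). Error handling omitted for brevity. */
    #include <SDL3/SDL.h>

    int main(void)
    {
        SDL_Init(SDL_INIT_VIDEO);

        SDL_GPUDevice *device = SDL_CreateGPUDevice(
            SDL_GPU_SHADERFORMAT_SPIRV | SDL_GPU_SHADERFORMAT_MSL | SDL_GPU_SHADERFORMAT_DXIL,
            true /* debug mode */, NULL /* let SDL pick a backend */);
        SDL_Window *window = SDL_CreateWindow("demo", 1280, 720, 0);
        SDL_ClaimWindowForGPUDevice(device, window);

        /* Per frame: acquire a command buffer and the swapchain texture,
         * record render passes against it, then submit. */
        SDL_GPUCommandBuffer *cmd = SDL_AcquireGPUCommandBuffer(device);
        SDL_GPUTexture *backbuffer = NULL;
        SDL_AcquireGPUSwapchainTexture(cmd, window, &backbuffer, NULL, NULL);
        /* ... SDL_BeginGPURenderPass / draws / SDL_EndGPURenderPass ... */
        SDL_SubmitGPUCommandBuffer(cmd);
        return 0;
    }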


I'm working with SDL GPU now, and while it's nice, it hasn't quite cracked the cross platform nut yet. You still need to maintain and load platform-specific shaders for each incompatible ecosystem, or you need a set of "source of truth" HLSL shaders that your build system processes into platform-specific shaders, through a set of disparate tools that you have to download from all over the place, that really should be one tool. I have high hopes for SDL_shadercross to one day become that tool.


I thought shaders just needed to be compiled to SPIR-V.


My comment was specifically about cross-platform. Apple operating systems don't know what SPIR-V is.


Oh, well, sure, if you're targeting Apple as a platform you're gonna have to deal with their special snowflake graphics API.


I wish Apple had made a point to support Vulkan. I know about MoltenVK and all that fun stuff, but for a time, there was a graphics API that worked on all of the major platforms: OpenGL.

Vulkan was meant to succeed OpenGL, and despite my annoyances with the API, I still think that it's nice to have an open standard for these things, but now there isn't any graphics API that works on everything.


SDL GPU is extremely disappointing in that it follows the Vulkan 1.0 model of static pipelines and rigid workflows. Using Vulkan 1.3 with a few extensions is actually far more ergonomic beyond a basic "Hello, World" than using SDL GPU.
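
To make that concrete, here's roughly the Vulkan 1.3 style I mean, assuming dynamic rendering is the kind of feature in question (a sketch only; layout transitions, barriers, and pipeline setup are omitted):

    /* Sketch: recording a color pass with Vulkan 1.3 dynamic rendering --
     * no VkRenderPass or VkFramebuffer objects. Assumes `cmd` is already
     * recording and the image is in COLOR_ATTACHMENT_OPTIMAL. */
    #include <vulkan/vulkan.h>

    void record_color_pass(VkCommandBuffer cmd, VkImageView color_view, VkExtent2D extent)
    {
        VkRenderingAttachmentInfo color = {
            .sType       = VK_STRUCTURE_TYPE_RENDERING_ATTACHMENT_INFO,
            .imageView   = color_view,
            .imageLayout = VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL,
            .loadOp      = VK_ATTACHMENT_LOAD_OP_CLEAR,
            .storeOp     = VK_ATTACHMENT_STORE_OP_STORE,
            .clearValue  = { .color = { .float32 = { 0.0f, 0.0f, 0.0f, 1.0f } } },
        };
        VkRenderingInfo info = {
            .sType                = VK_STRUCTURE_TYPE_RENDERING_INFO,
            .renderArea           = { .extent = extent },
            .layerCount           = 1,
            .colorAttachmentCount = 1,
            .pColorAttachments    = &color,
        };
        vkCmdBeginRendering(cmd, &info);
        /* bind pipeline, set dynamic viewport/scissor, issue draws here */
        vkCmdEndRendering(cmd);
    }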


That might exclude a lot of your user base. For example a big chunk of Android users, or Linux workstation users in enterprise settings who are on older LTS distributions.


SDL GPU doesn't properly support Android anyways due to driver issues, and I doubt anyone's playing games on enterprise workstations.


But SDL is super high-level. If you want to do more than Pong, you'll hit a wall very quickly.

I just want OpenGL, it was the perfect level of abstraction. I still use it today, both at work and for personal projects.


For what it's worth my experience with Metal was that it was the closest any of the more modern APIs got to OpenGL. It's just stuck on an irrelevant OS. If they made sure you could use it on Windows & Linux I think it'd fill a pretty cool niche.


WebGPU is in many ways closer to Metal than to Vulkan. You can use the API outside of the browser too, especially in Rust.


> WebGPU is in many ways closer to Metal than to Vulkan.

If only that were true for the resource binding model ;) WebGPU BindGroups are a 1:1 mapping to the Vulkan 1.0 binding model, and it's also WebGPU's biggest design wart. Even Vulkan is moving away from that overly rigid model, so we'll probably be stuck with a WebGPU that's more restrictive than required by any of its backend APIs :/


I'll check out WebGPU at some point, I guess. I've written our rendering layer in all of the major APIs (OpenGL, DX12, Vulkan and Metal) and found it very instructive to have all of them to compare at the same time because it really underscored the differences; especially maintaining all of them at the same time. We eventually decided to focus only on DX12, but I think I'll revive this "everything all at once" thing for some side projects.


As someone who has done this since DX7, what you're looking for is WebGPU, via either Dawn (Google) or wgpu-native (Firefox). WebGPU works. It's 99% of the way there across platforms.

There’s another wrapper abstraction we all love and use called BGFX that is nice to work with. Slightly higher level than Vulkan or Metal but lower than OpenGL. Works on everything, consoles, fridges, phones, cars, desktops, digital signage.

My own engines have jumped back and forth between WebGPU and BGFX for the last few years.


Personally I'm not interested in the web as a platform. The APIs themselves I'm interested in, but as a target I think the web needs to die for everything that isn't a document.


I never mentioned the web as a target, rather devices. You don’t need a browser, you need a window or a surface to draw on and use C/C++/Rust/C# to write your code.

WebGPU is a standard, not necessarily for the web alone.

At no point does a browser ever enter the picture.

https://eliemichel.github.io/LearnWebGPU/index.html


It sounds like the standard did itself a disservice with its name; it's more interesting the way you describe it.


Well, it started off with “all the right intentions” of providing low-level access to the GPU for browsers to expose as an alternative to WebGL (and OpenGL ES like API’s of old).

However, throw a bunch of engineers in a room…

When wgpu got mature enough, they needed a way to expose the rust API for other needs. The C wrapper came. Then for testing and other needs, wgpu-native. I’m not a member of either team so I can’t say why for sure but because of those decisions, we have this powerful abstraction available pretty much on anything that can draw a web page. And since it’s just exposing the buffers and things that Vulkan, Metal, etc are already based on, it’s damned fast.

The added benefit is you get WGSL as your shading language which can translate into any and all the others.

The downside is that it provides NO WINDOW support, as that needs to be provided by the platform, i.e. you. The good news is the tests and such use GLFW, and it's the same setup to get WebGPU working as it is to get Vulkan working: make a window, probe it, make a surface/swap chain, start your threads.


The WebGPU spec identifies squarely as a web standard: "WebGPU is an API that exposes the capabilities of GPU hardware for the Web." It makes no mention of non-web applications.

It's true that you can use Dawn and wgpu from native code, but that's all outside the spec.


There is mention of desktop applications in their getting-started docs; it seems well within the intention of the maintainers to me.

https://eliemichel.github.io/LearnWebGPU/introduction.html

> Yeah, why in the world would I use a web API to develop a desktop application?

> Glad you asked, the short answer is:

    Reasonable level of abstraction

    Good performance

    Cross-platform

    Standard enough

    Future-proof


This is an indie site. Nothing wrong with it but it's not canon.


And yet Electron exists…

The intent and the application are never squarely joined. Yes it’s made for the web. However, it’s an API for graphics. If you need graphics, and you want to run anywhere that a web page could run, it’s a great choice.

If you want to roll your own abstraction over Vulkan, Metal, DX12, Legacy OpenGL, Legacy DX11, Mesa - be my guest.


You mentioned "Google" and Firefox, one of which is a browser. I clarified that I'm not interested in the web as a target, not to dismiss your entire suggestion but rather to clarify that that particular part doesn't interest me.


I like OpenGL ES, but the support for compute shaders sucks, and I hate transform feedback. I am in the process of trying out WebGPU now, but it doesn't have good native support everywhere like OpenGL ES 3 does.


OpenGL is designed-by-committee state-machine crap.

You don't know it yet, but what you really want is DirectX 9/10/11.


Maybe, but Microsoft tech feels icky


SDL GPU, not SDL. The GPU project is a generic wrapper/abstraction on top of the modern GPU APIs, such as Vulkan. You do all the same stuff as you would in a modern GPU API, except in a generic / slightly more accessible way.


In practice SDL is used to abstract away the system-dependent parts required to set up OpenGL.

