Hacker News | henningo's comments

To be fair, Linux users aren't completely left behind; there's the web interface (play.spotify.com).

I'd be interested to know how many people use the native app vs. the web interface. The web interface is not always as snappy as the native apps, but if it bogs down I can just kill the tab and launch it again.


1. I hate web apps for things that run in the background; music is one of the things that needs a tray app. 2. If it "bogs down" I can kill it from my dropdown terminal with killall, probably much faster than killing the tab.

I do not spend all of my time in a browser, so I want a native app.


This is great news for Jupyter (IPython) Notebooks: https://plot.ly/python/offline/

It basically provides interactive inline visualizations, making it great for data exploration (without having to send any data to plot.ly)

I'm wondering if the success of Continuum (Anaconda) could've influenced Plot.ly?


I have tried both Bokeh (without the server) and plotly.js (their Python lib offers a JSON-encode option, so you can use the Python lib with plotly.js without sending data to plotly).

However, I found both to be very similar and still don't have a clear favorite. Any opinions on which solution is better suited for (internal) interactive dashboards?


One thing I've been waiting for (and if I had the time, I'd love to work on something like this) is scientific javascript. I love es6 and the classless oop[0] that javascript provides (and its general dynamic-ness), but the lack of operator overloading means mathy infix operators can't work with an array-like type in native js. Something that had the "functional", dynamic semantics of javascript with the natural notation that numpy has (with its broadcasting rules) would be fantastic.

Stuff like plotly is great, but as far as I can tell, I'm not going to be able to use it any time soon, because I'm not going to write an expression with more than 10 terms in javascript when every '+' becomes an 'add(a,b)' or 'a.add(b)'.

[0] https://youtu.be/bo36MrBfTk4?t=33m45s
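To make the notation gap concrete, here's a toy sketch in Python (a hypothetical `Arr` type, not numpy itself): two dunder methods are enough to get infix math plus scalar broadcasting, which is exactly what native JS classes can't express.

```python
class Arr:
    """Toy 1-D array with numpy-style infix operators and scalar broadcasting."""
    def __init__(self, data):
        self.data = list(data)

    def _coerce(self, other):
        # Broadcast a bare scalar up to the array's length.
        if isinstance(other, (int, float)):
            return [other] * len(self.data)
        return other.data

    def __add__(self, other):
        return Arr(a + b for a, b in zip(self.data, self._coerce(other)))

    def __mul__(self, other):
        return Arr(a * b for a, b in zip(self.data, self._coerce(other)))

a = Arr([1, 2, 3])
b = Arr([10, 20, 30])
result = (a + b) * 2   # in native JS this would be a.add(b).mul(2)
```

That last line is the notational pain being described: every operator in a long expression turns into a method call.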


To do that you first need to have a way of extending JavaScript with C and/or Fortran. No way numpy can be implemented in pure js or python without being excruciatingly slow.


You can do it with the nodejs FFI. It makes me cringe a bit to say that, but you could do it in roughly the same way as how NumPy does it.
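For comparison, the same binding pattern in Python's stdlib `ctypes` (the node-ffi workflow is analogous): declare the foreign function's signature, then the heavy lifting stays in compiled code and the host language just dispatches to it. A minimal sketch, assuming a Unix-like system where the C math library can be located:

```python
import ctypes
import ctypes.util

# Load the C math library and declare sqrt's signature, NumPy-style:
# the computation happens in compiled C, Python only binds to it.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

libm.sqrt(2.0)   # same value as math.sqrt(2.0), but via the FFI
```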


Indeed! I think the success of Bokeh (http://bokeh.pydata.org/) from Continuum has influenced their decision...


Thanks for the link! Have you found a way to use the plots offline on your own server (e.g. in Django)? I have only found the Jupyter integration so far...


I think it is a bit unfair to just blame the manufacturers. The legislation could likely be changed to make for a more robust test approach. After all, it is engineers we are talking about here: if they are given the task of optimizing an engine's emissions and fuel consumption for a specific drive cycle, they will do it. But then of course there are the ethical aspects, which I won't go into.

A computer science analogy to this would be if I gave you an uncompressed image and you had to develop a compression algorithm that made the image as small as possible, you could likely come up with a really good solution. But your algorithm most likely wouldn't do as well on any other image.

I think a feasible solution to this would be to test under a wider range of conditions and to add (statistically defined) noise to the testing procedures. The added cost of the additional testing would be very small compared to the cost of a vehicle development program.

A good overview of the different drive cycles can be found here: http://www.car-engineer.com/the-different-driving-cycles/


"A computer science analogy to this would be if I gave you an uncompressed image and you had to develop a compression algorithm that made the image as small as possible, you could likely come up with a really good solution. But your algorithm most likely wouldn't do as well on any other image."

That analogy doesn't hold. It's more akin to developing two compression algorithms: one for the general case, and a specific algorithm that is used only when the test image is detected, giving better-than-general compression for that one case.

It is entirely fair to blame the manufacturers for this. Gaming emissions results required effort to accomplish, and is completely unethical from an engineering standpoint.
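The "two algorithms" version of the analogy can be sketched in a few lines of toy Python (the benchmark input and one-byte token are made up for illustration): detect the known test input and special-case it, fall back to honest compression otherwise.

```python
import hashlib
import zlib

BENCH = b"the one benchmark image"            # hypothetical fixed test input
BENCH_HASH = hashlib.sha256(BENCH).digest()

def sneaky_compress(data: bytes) -> bytes:
    # Special-case the benchmark: emit a 1-byte token that a matching
    # decompressor (with the image baked in) would expand perfectly.
    if hashlib.sha256(data).digest() == BENCH_HASH:
        return b"\x00"
    # Everything else gets the honest, general-purpose path.
    return zlib.compress(data)
```

The defeat-device parallel is the `if`: detection of the test condition selects a code path that never runs in the real world.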


I agree that the analogy doesn't hold for the recent VW debacle (where the calibration was changed during certification testing), but it holds for the industry at large and what has been going on for the past 10-20 years, which is what the paper is about.

The vehicle manufacturers optimize the engine calibration for the drive cycle they are trying to beat. That is why a US-spec BMW has a different engine tune than a Euro-spec BMW, for example: the drive cycles are different.


Well done! Just one thing I noticed: the speed is in mph while the distance unit is meters. It would be nice to change the speed to km/h.


Articles like this just confirm my theory that car journalists are the biggest obstacle to the automotive industry truly innovating. It is as if they dictate the requirements for the general public.

What the author completely seems to ignore is how turbos are key to the entire downsizing trend that is drastically changing cars. With lower mass, less inertia and less internal friction, going back to first principles is what is making a difference these days.


This is an article for people that like to drive, and he is correct: you want a linear response in turns. He is not dictating requirements for the general public --- he is describing reality.


Nice implementation - less is more!

What algorithm did you use for color extraction? I seem to be getting colors with the correct hue, but with too low saturation (example: Ferrari).

You could also imagine not showing colors with very low saturation (i.e. whites/grays/blacks). I've previously used Colorific [1], which seems to address some of those issues!

[1] http://99designs.com/tech-blog/blog/2012/05/11/color-analysi...
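The low-saturation filter is easy to sketch. Here's a toy version using an HSV conversion (the threshold values are made up; a real palette tool would tune them):

```python
import colorsys

def is_chromatic(rgb, min_sat=0.15, min_val=0.1):
    """Keep a color only if it has real hue content (not white/gray/black)."""
    r, g, b = (c / 255 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # Low saturation = near-gray; low value = near-black. Drop both.
    return s >= min_sat and v >= min_val

# White and mid-gray are filtered out; a saturated red survives.
palette = [(255, 255, 255), (128, 128, 128), (200, 30, 30)]
chromatic = [c for c in palette if is_chromatic(c)]
```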


I'm also curious about the algorithm, since a search for "google" results in a palette that only features one color out of the company's primary-plus-green motif. Seems like backgrounds (white in the case of Google, black in the case of demotivational posters) are weighted a little too strongly, perhaps.


Fair point.

Nonetheless, I felt the topic was worth discussing. In my experience "teams of very talented people" can often turn into "teams of very competitive people".


> Nonetheless, I felt the topic was worth discussing.

Oh, I agree. But its source is troubling -- it's put forth as though it has academic depth and evidence, when in fact, it's just another pop-psychology tract.

For some reason, in modern times things like this, things that can never be more than opinion, are dressed up as though they're backed by science. Among uneducated people this gives them an unearned weight.

It's not as though an idea like this is false or unworthy of discussion. It's that it can never be either established as true, or falsified. This means its discussions tend to be endless and inconclusive.


I have not read the full article (paywall), but I have definitely experienced this in professional settings. I'd be interested to hear other people's thoughts on this, especially given startups' desire to build "killer teams"!



Did you consider doing spline fits to the x,y coordinates and then just saving the spline coefficients instead?

The fit process could likely be done client-side, and could offer some subtle smoothing if wanted?

Anyhow, really nice execution!


paper.js has a nice method to do this. I actually used it in a whiteboard app I made as a proof of concept. Rather than saving every X,Y coordinate, it uses a configurable number of points and handles and makes Bézier curves. http://paperjs.org/examples/path-simplification/
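paper.js's simplify does actual curve fitting, but the plain point-reduction side of the idea is the classic Ramer-Douglas-Peucker algorithm: keep only the points that deviate from the chord by more than a tolerance. A rough Python sketch:

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker: drop points within epsilon of the chord."""
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        x0, y0 = points[i]
        # Perpendicular distance from (x0, y0) to the line through the endpoints.
        num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
        den = math.hypot(y2 - y1, x2 - x1)
        d = num / den if den else math.hypot(x0 - x1, y0 - y1)
        if d > dmax:
            dmax, idx = d, i
    if dmax > epsilon:
        # The farthest point matters: keep it and recurse on both halves.
        left = rdp(points[: idx + 1], epsilon)
        right = rdp(points[idx:], epsilon)
        return left[:-1] + right
    # Everything is within tolerance: the chord alone suffices.
    return [points[0], points[-1]]
```

A larger epsilon gives more aggressive reduction (and, as a side effect, some smoothing of jitter).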


Although this article discusses the actual collection and transmission of data, what I find more interesting is how efficiently the vast amount of data is being used in the decision making process for vehicle setup and race strategy. (I strongly believe that data has no value until it is used for analysis and decision making.)

Data coming from the vehicle's sensors and other sources is concurrently used for analysis (by both hundreds of engineers and a wide range of "algorithms") and as input to simulation models. The results from the simulations mean that even more data is being generated, even when the vehicle isn't running.

Within minutes, using the data, the different engineering groups (typically each responsible for a sub-system, i.e. engine, tires, aerodynamics) arrive at conclusions, which the vehicle's race/performance engineers then use to enhance the setup of the vehicle. The results of the changes are then fed back to the engineers, and evaluating whether the analysis and predictions were correct is a big part of the post-event work.

I honestly can't think of another industry that carries out this kind of analysis of highly non-linear systems at this scale and speed. The only other one that comes to mind is finance?

Disclosure: I work in motorsports


This is done during flight test of prototype aircraft (and probably spacecraft as well to some extent, I haven't done that) - the data acquisition system on a flight test plane is incredibly advanced, and the data gets beamed down in realtime to a group of (real) engineers on the ground in the flight test station who analyze the data and communicate back and forth to the pilots again in real time. All in real time, the test pilots will decide what/how maneuvers to do based on learnings captured/analyzed, change the configuration of the aircraft systems/surfaces, intentionally induce failure conditions and faults, ... It's a real ballet between the engineers on the ground and the pilots, when the team is working well together. As you said, post-event debriefing, detailed analysis, recommendations/reports, aircraft changes etc are started immediately when the aircraft lands, to prepare for the next flight. It's very intense, but also very rewarding and a lot of fun.


beamed down in realtime to a group of (real) engineers on the ground

I assume this is in reference to use of the term "race engineer" in F1.

I can assure you that the sport does use real engineers, and very intensely so. Folks like Adrian Newey would qualify as engineers by anyone's definition, and each team has heaps of MechE's, EEs, software engineers, even materials engineers, back at home base.


.. not to mention that there's a lot of overlap between aerospace and F1 engineering. Plenty of the engineers (even the trackside ones) started out studying aero engineering.


Yes, that makes sense, thanks for the insight.


Power grid management is of similar complexity, and it's a 24/7 job. Everything from the current temperature to when TV shows end impacts the grid. Add to that just how expensive peaking power is, and the incentives for extremely accurate modeling become significant.


Of course, and I'm sure dealing with the many different data sources (e.g. weather and TV listings) makes it even more complicated!


Do you know if anything similar is being done in MotoGP? I know the communication with the riders is much more limited, but I'm curious what they can get and send back to the bikes themselves, perhaps communicating via the instrument panel.


I also find this fascinating, but have never seen any good technical discussion of it. Do you have any pointers?


The more interesting network side is the link between car and garage; renting a high-bandwidth private circuit for sports events is SOP these days, and has been going back to the '50s or '60s.

