mwbajor's comments

They're just ratios....


I'm a HW engineer and don't really understand "complexity" as this article describes it. I didn't read it in depth, but it doesn't give any good examples with specifics. Can someone give a detailed example of what the author is really talking about?



Systems Thinking 101


We use this website for diving off the US East Coast.

https://www.facebook.com/p/Eastern-Search-Survey-10006355294...

ESS has mapped and imaged many ships off the eastern seaboard, some sunk by U-boats during WW1 and WW2. The images and data collected are excellent.


Measuring the Hubble constant requires measuring the distance to an object in space, which is very hard to do at the extreme distances involved (https://en.wikipedia.org/wiki/Parallax). In the end, the variation in the Hubble constant might be due only to our limited measurement accuracy.
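
As a rough illustration of why parallax runs out at these scales (ballpark numbers of my own, not from the linked article): distance in parsecs is just the reciprocal of the parallax angle in arcseconds, so the instrument's angular precision sets a hard ceiling.

    # Sketch: annual parallax gives d (parsecs) = 1 / p (arcseconds).
    # Assuming a best-case astrometric precision on the order of ~10
    # microarcseconds (roughly Gaia-class), usable parallax distances top
    # out around the kiloparsec scale, far short of the megaparsec
    # distances where the Hubble flow is measured, hence the reliance on
    # a "distance ladder" of indirect methods.

    def parallax_distance_pc(parallax_arcsec: float) -> float:
        """Distance in parsecs from an annual parallax angle in arcseconds."""
        return 1.0 / parallax_arcsec

    print(parallax_distance_pc(0.7687))  # Proxima Centauri: ~1.3 pc
    print(parallax_distance_pc(10e-6))   # 10 uas floor: 100,000 pc (~0.1 Mpc)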


All definitions of entropy stem from one central, universal definition: entropy is the amount of energy unavailable for useful work. Or, put better: entropy describes the fact that not all of the energy consumed can be used for work.


There's a good case to be made that the information-theoretic definition of entropy is the most fundamental one, and the version that shows up in physics is just that concept as applied to physics.
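
For reference, here's a minimal sketch of the information-theoretic (Shannon) definition; the Gibbs entropy of physics is the same sum over microstate probabilities, just with a natural log and a factor of Boltzmann's constant:

    import math

    def shannon_entropy(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits
    print(shannon_entropy([0.25] * 4))  # uniform over 4 outcomes: 2.0 bits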


My favorite course I took as part of my physics degree was statistical mechanics. It leaned way closer to information theory than I expected going in, but in retrospect that should have been obvious.

Unrelated: my favorite bit from any physics book is probably still the introduction of the first chapter of "States of Matter" by David Goodstein: "Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics."


That would mean that information theory is not part of physics, right? So information theory and entropy are part of metaphysics?


Well it's part of math, which physics is already based on.

Whereas metaphysics is, imo, "stuff that's made up and doesn't matter". Probably not the most standard take.


I'm wondering: isn't information theory as much a part of physics as thermodynamics is?


Would you say that Geometry is as much a part of physics as Optics is?


Not really. Information theory applies to anything probability applies to, including many situations that aren't "physics" per se. For instance it has a lot to do with algorithms and data as well. I think of it as being at the level of geometry and calculus.


Yeah, people seem to miss that the entropy used in thermodynamics is simply an aggregate statistic that summarizes the complex state of a thermodynamic system as a single real number.

The fact that entropy always rises, etc., has nothing to do with the statistical concept of entropy itself. It is simply an easier way to express the physical observation that individual atoms spread their kinetic energy out across a large volume.
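
A toy example of that point (my own sketch, not from the thread): treat each particle as independently sitting in the left or right half of a box. The "spread out" macrostate corresponds to vastly more microstates, and S = k_B ln(Omega) just summarizes that count as a single number.

    from math import comb, log

    # Toy model: N particles, each in the left or right half of a box.
    # Omega(n_left) = C(N, n_left) microstates per macrostate, and
    # S = ln(Omega) in units of k_B.
    N = 100
    for n_left in (100, 90, 50):
        omega = comb(N, n_left)
        print(n_left, omega, log(omega))

    # n_left=100 (all on one side): Omega = 1,      S = 0
    # n_left=50  (evenly spread):   Omega ~ 1e29,   S ~ 66.8
    # "Entropy rises" just means the system wanders into the huge macrostate.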


I'm not sure that's quite the right perspective. It's not a coincidence that entropy increases over time; the increase in entropy seems to be very fundamental to the way physics goes. I prefer the interpretation "physics doesn't care what direction the arrow of time points, but we perceive it as pointing in the direction of increasing entropy". Although that's not totally satisfying either.


This definition is far from universal.


I think what you describe is the application of entropy in the thermodynamic setting, which doesn't apply to "all definitions".


Because they would assume he has American and British friends that he still might talk to.


"....why doesn't that throw in every EDA tool?"

This would require repetitive SPICE simulations, or at the very least basic rule checking. Nobody does full SPICE simulations at the board level; however, basic input/output port checking (usually part of the ERC) does get performed. Even with RF designs, you carve out the piece you need to examine or design and simulate that. For the chips I've worked on, the full chip would get a SPICE simulation that took days/weeks, but this was for more R&D-oriented mixed-signal designs. I guess what I'm saying is that simulating a circuit is best performed as a deliberate, iterative step in the circuit design process.

When it comes to layout, however, you do get hints from the DRC (Design Rule Check) tool, which will tell you if a trace is drawn incorrectly based on the design rule constraints, and nowadays sometimes from an EM simulation that can run in the background.

Completely automated design, especially for analog, will most likely never be a thing for the other reasons you list. However, I can already use "known good" circuits and modularize them for reuse, which does speed things up. This is critical in the ASIC world due to the large hierarchies in those designs. Modular reuse is also a growing tool in the PCB world: Cadence now has a very nice module/reuse tool that can even detect and create modules, preventing you from having to redraw the layout for a sub-circuit multiple times if it's not already instantiated as a module. I always like it when more people want to get involved in HW, but what the OP is showing largely exists in the form of TCL and SKILL scripts in current EDA SW packages.


I think a major question is WHICH current EDA SW packages.

Cadence OnCloud Platform for PCB Design is $1,500/month. Altium Designer is $11k.

Most of the universe is in the world of KiCAD, Eagle, GEDA, and similar.


I work in analog.

1) Noise is an issue as the system gets complex. You can't get away with counting to 1 anymore; all those levels in between matter.

2) It's hard to make an analog computer reconfigurable.

3) Analog computers do exist commercially, believe it or not, but for niche applications and essentially as coprocessors.


Quantization of parameters in neural networks is roughly analogous to introducing noise into analog signals. We've got good evidence that these architectures are robust to quantization, which implies they could be implemented over noisy analog signals.

Not sure who’s working on that but I can’t believe it’s not being examined.
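
Here's a back-of-the-envelope sketch of the analogy (toy sizes and bit widths of my own choosing): uniformly quantizing a weight matrix injects a bounded per-weight error, much like analog noise, and the layer's output barely moves.

    import numpy as np

    rng = np.random.default_rng(0)

    def quantize(w, bits=4):
        """Uniform quantization of w onto 2**bits levels over its range."""
        lo, hi = w.min(), w.max()
        step = (hi - lo) / (2**bits - 1)
        return lo + step * np.round((w - lo) / step)

    # Toy "layer": y = W @ x. Quantization error per weight is bounded by
    # half a step, analogous to noise riding on an analog weight.
    W = rng.standard_normal((64, 64))
    x = rng.standard_normal(64)
    y, y_q = W @ x, quantize(W) @ x
    print(np.linalg.norm(y - y_q) / np.linalg.norm(y))  # small relative error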


You're never going to find a solid answer to these questions. The best you can do is ask yourself "does this seem reasonable".


That is absolutely fine to me. (I mean, I'd like answers, but I've gotten used to the idea of limits to our knowledge.) What I appreciated so much is that it broadened the range of potential answers for me, emphasising just how much I'm never going to find solid answers.


Doesn't sound like science at all, does it?


Why would it? It's anthropology.


Bandpass sampling is a standard technique in the RF world for mixing a signal down to its final IF or baseband frequency digitally. In this case you are undersampling, but with a well-defined signal whose location you know a priori, both before and after sampling.

https://en.wikipedia.org/wiki/Undersampling
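
A quick sketch of the idea (the frequencies here are made up but typical): a 70 MHz carrier sampled at 56 MS/s, well under the carrier's Nyquist rate, folds down to a clean 14 MHz IF, exactly where Nyquist-zone folding predicts.

    import numpy as np

    fc = 70e6   # carrier frequency (known a priori to sit in this band)
    fs = 56e6   # sample rate, deliberately < 2*fc

    n = np.arange(4096)
    x = np.cos(2 * np.pi * fc * n / fs)  # the undersampled carrier

    # Fold fc into the first Nyquist zone [0, fs/2] to predict the alias.
    f_if = abs(fc - round(fc / fs) * fs)
    print(f_if / 1e6)  # 14.0 MHz

    spectrum = np.abs(np.fft.rfft(x))
    peak_hz = np.argmax(spectrum) * fs / len(n)
    print(peak_hz / 1e6)  # the spectral peak lands at 14.0 MHz, the aliased IF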

