afvictory's comments

These large funds are pretty diversified; check out the letter Sequoia sent to their LPs regarding FTX [1]. Sequoia's $150M cost basis for their FTX investment accounts for only 3% of the fund's committed capital.

[1] https://twitter.com/sequoia/status/1590522718650499073


I'd be interested in a Houston meetup!


Same!

Is it weird to post this from a throwaway acct?


I haven't run into the visual issues you mention but I've definitely run into a lot of problems regarding failure to wake.

Currently using: two Gigabyte M28Us (scaled to 1440p) and one Dell S2716DGR, all running at 144 Hz on various Macs. One of the M28Us is passing through the mouse, keyboard, and webcam.

I don't have any issues with the 2017-era MacBook Pro using random DisplayPort-to-USB-C cables I bought off Amazon.

I did have a bunch of issues with the M1 Pro MacBook. I used it exclusively in clamshell mode and 90% of the time it failed to wake up and detect the monitors in general. To get it working, I had to continuously unplug and plug the cables back in. Sometimes this worked on the first try, other times it took 10 minutes of messing with the cables. I upgraded to using certified Thunderbolt 4 cables on the M28Us but that didn't fix the problem.

Currently I'm running a baseline Mac Studio. The biggest issue I have is that the display order changes most of the time when I wake the computer. Every now and then it fails to pick up one of my displays and I have to unplug it and plug it back in. It's a little annoying, but definitely not as frustrating as the M1 Pro MacBook.


> but I've definitely run into a lot of problems regarding failure to wake.

I'm starting to believe there are some collective failures in modern laptop power design. Multiple laptops from different manufacturers have had similar power issues waking from sleep states, and many even went on to die outright, as nothing seemed to power up anymore.


Small world, maybe our paths crossed! I had the pleasure of working with the govinfo team from 2015-2017 as they were rolling out the beta and finalizing the migration from the old fdsys.gov. Truly great folks that are passionate about maintaining government information and making it as accessible as possible.


Thanks for your work! (If our paths crossed you would have known me as the persistent guy at GovTrack asking for bill status XML over and over. :) )


There are also no results for "tank man" in Apple's GIF/#images tool in iMessage. There is a GIF present in Messenger's GIF tool which uses Giphy/Tenor.

Edit: The same is true if you search "Tienamen"


Roscosmos has shared before & after satellite imagery for anyone interested: https://twitter.com/roscosmos/status/1291023063404994560/pho...


Regarding the satellite image, a few questions:

1) Can nation-states spy like this on every location on the planet?

2) Can you view a live feed of any location and follow vehicles (for example)?

3) Is there anything a country can do to prevent others from spying on it from satellites?


1. Yes. Not just state actors - Planet Labs (https://www.planet.com/) in San Francisco, for example, is a commercial satellite company that photographs the whole globe once per day and sells that imagery online.

2. Live feed is a bit tricky, and that's where governments have an advantage - they own their own satellites, and can task them to follow a specific target. But you have to know where the target is at the start of the window, they don't have real-time video of the whole planet, and unless you've got a very big fleet you won't always have a satellite overhead when you want to look at your target.

3. Keeping track of the times of satellite passes overhead, hiding stuff underground, putting your aircraft in covered hangars and only moving them at night, putting a roof on your military docks, using upwards-facing camouflage, etc. Same methods that have been used for a hundred years to hide from air surveillance.


  "Planet Labs ... photographs the whole globe once per day..."
This is not an accurate representation of Planet Labs' capabilities. I can attest to this as a Farmer's Edge customer.

In the theoretical world where every inch of the earth was photographed every day things like losing MH-17 likely wouldn't have happened.

ETA: Planet claims the "entire landmass" every day, though I find that claim extremely suspect, and even that does not cover the entire globe.


> In the theoretical world where every inch of the earth was photographed every day things like losing MH-17 likely wouldn't have happened.

I think you mean MH370 instead of MH-17.

https://en.wikipedia.org/wiki/Malaysia_Airlines_Flight_370


What is your experience of the frequency of Planet's reimaging? And at what resolution?


Somewhere between weekly and every 3 days, occasionally worse. I do not know the exact resolutions involved, but they have "good" and "bad" NDVI images, and the "bad" ones are fairly useless for agricultural applications.


To expand on 3: most spy satellites operate in sun-synchronous polar orbits. This lets them cover the entire Earth while keeping their solar panels lit by the sun. It also has the ramification that they pass over any given spot at the same local time each day. If you know that time, that's when you hide your tanks and don't fly your top-secret spyplane.
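The predictability of those passes falls out of basic orbital mechanics: for a circular orbit the period depends only on altitude. A back-of-the-envelope sketch (Python; the 500 km altitude is just an illustrative figure I picked, typical of imaging satellites):

```python
import math

MU_EARTH = 3.986004418e14  # standard gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0      # mean Earth radius, m

def orbital_period_s(altitude_m: float) -> float:
    """Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3 / mu)."""
    a = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH)

# A ~500 km orbit repeats roughly every 94-95 minutes, which is why
# pass times over a given site are easy to tabulate in advance.
print(orbital_period_s(500_000) / 60)  # ~94.5 minutes
```

With the orbital elements published in public catalogs, anyone can build the same timetable the defenders use.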


Even if they're not in SSO, satellites are hard to hide. The timetables might be more complicated for a rank and file soldier to use, but you can time your actions to be missed by any specific low-orbiting satellite.


"will have their solar panels lit by the sun" is probably not the primary reason for sun-synchronous orbits (though I guess it is presumably a benefit) - it's more about having a constant source of illumination to enable photography, as well as the ability to photograph the same site with the same shadow angle over time to facilitate comparison.


> 2. Live feed is a bit tricky, and that's where governments have an advantage - they own their own satellites, and can task them to follow a specific target. But you have to know where the target is at the start of the window, they don't have real-time video of the whole planet, and unless you've got a very big fleet you won't always have a satellite overhead when you want to look at your target.

I would really not be surprised to see the NRO go the Planet Labs / Starlink route at some point in the future and put 20 satellites in a dozen orbital planes to provide coverage everywhere at less than 60 degrees latitude 24/7/365. Sure, maybe you need the big KH-11s for the extremely high resolution shots - but 3m resolution isn't exactly trash either.


At 3m resolution a tank is about 3 pixels; a sedan is around 1 pixel. It's very hard to distinguish objects from each other and therefore to track one at that resolution.
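As a sanity check on those numbers (Python; the object sizes are ballpark figures of my own choosing):

```python
def pixels_spanned(object_size_m: float, gsd_m: float) -> float:
    """Pixels an object spans along one axis at a given ground sample distance."""
    return object_size_m / gsd_m

# A main battle tank hull is roughly 9 m long; a sedan roughly 4.5 m.
print(pixels_spanned(9.0, 3.0))  # 3.0 pixels: a smudge, not a recognizable tank
print(pixels_spanned(4.5, 3.0))  # 1.5 pixels: essentially a single dot
```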


> unless you've got a very big fleet

Starlink


With synthetic aperture radar imagery it's also possible to "peek" under overhead camouflage somewhat.

https://www.38north.org/2020/01/sinpo010320/


I don't think you can get a live feed of a location, since the satellites generally aren't geostationary (geostationary orbit is too far away, and geostationary satellites are always above the equator). But AFAIK even commercially you can get still photographs of any point on Earth within a couple of days (i.e. you don't have to be a nation-state or have your own satellites).


A live feed to follow something would 99% of the time be implemented from a HALE (high altitude, long endurance) Group 4-size UAS, an RQ9 or something in the same class, with a combination of a high-end gimbaled camera system and a satellite data uplink.


Geostationary is very very far away and useless for nighttime. Imaging satellites usually have a shorter polar orbit that precesses east to west along with sunshine. https://en.wikipedia.org/wiki/Sun-synchronous_orbit


1) Yes.

2) Well, I can't, but the capability exists.

3) Dig.


Satellites fly unpowered and frictionless in vacuum, with kinetic energy equivalent to Mach 22 at sea level, balanced against Earth's gravitational pull, along a fixed 2D ellipse with one focus at the Earth's center of gravity.

So you can't, say, fly a 3D orbit, fly at the same altitude with a different angular velocity, or change course without expelling significant amounts of mass. You play by those rules.
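The Mach 22 figure checks out for low Earth orbit. A quick check (Python; I'm taking a 400 km circular orbit and 343 m/s as the sea-level speed of sound):

```python
import math

MU_EARTH = 3.986004418e14  # standard gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0      # mean Earth radius, m
SOUND_SEA_LEVEL = 343.0    # speed of sound at sea level, m/s

def circular_orbit_speed(altitude_m: float) -> float:
    """v = sqrt(mu / r) for a circular orbit."""
    return math.sqrt(MU_EARTH / (R_EARTH + altitude_m))

v = circular_orbit_speed(400_000)
print(v, v / SOUND_SEA_LEVEL)  # ~7670 m/s, roughly Mach 22
```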


2) costs lots of money, or you have to be a government.

Following vehicles is near impossible commercially (resolutions are too low).


2) is more of a drone thing


Drones require you to get permission or violate the airspace of the host country.


Or a good relationship with your neighbours or a target close to international waters. What’s a dozen extra kilometres when you’re that high anyway.



If you zoom in on the highways, you can see RGB ghost images of moving vehicles. Is that an artefact of using 3 distinct filtered inputs or something else?

[ps. The frequency shift is definitely reflecting vehicle direction - right side of highway is BGR, the other side is RGB. Delays in the sequence of 3 filtered captures?]


Yes, the different bands are not collected at the same time across the whole scene, so you get ghosting with moving objects. The SkySat sensor [1] is split in two halves: panchromatic and RGBN (where N is near-infrared). It takes lots of captures as the satellite travels that are later aligned and combined into one version.

[1] https://directory.eoportal.org/documents/163813/5615117/SkyS...
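A toy model of that timing offset (pure Python, with invented numbers) reproduces both the ghosting and the RGB-vs-BGR ordering flip between the two traffic directions:

```python
def band_positions(x0: float, speed_px_per_s: float, band_delays_s: list[float]) -> list[int]:
    """Pixel position of a moving object in each spectral band,
    given each band's capture delay relative to the first band."""
    return [round(x0 + speed_px_per_s * d) for d in band_delays_s]

# Bands captured in R, G, B order, 0.2 s apart; vehicle at pixel 100.
print(band_positions(100, +5, [0.0, 0.2, 0.4]))  # [100, 101, 102]
print(band_positions(100, -5, [0.0, 0.2, 0.4]))  # [100, 99, 98] -- color order reversed
```

Stationary scenery lands on the same pixel in every band after alignment, so only movers show the colored fringes.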


Looks like it was pretty fortunate to have been in that warehouse versus further back. A pretty good amount of the direct blast radius looks like it was water.


Jesus, it left a CRATER!


Look how much there was in that warehouse: https://cdn.ren.tv/cache/960x540/media/img/26/fb/26fb305fdd5... This was all on Ukrainian TV years ago when the crew was stranded there.


The 800 tons of Ammonium Nitrate that exploded in Tianjin in 2015 left a 100 meter wide crater. In Beirut it was 2750 tons, almost three and a half times as much.
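For a rough comparison of the two events, blast effects are often estimated with cube-root yield scaling. Treating crater width as proportional to the cube root of explosive mass (a very crude assumption: confinement, ground, and detonation completeness all differed between the two sites):

```python
def scaled_crater_width(ref_width_m: float, ref_mass_t: float, mass_t: float) -> float:
    """Cube-root yield scaling: width ~ (explosive mass)^(1/3)."""
    return ref_width_m * (mass_t / ref_mass_t) ** (1 / 3)

# Tianjin: ~100 m crater from ~800 t. Naive prediction for 2750 t:
print(scaled_crater_width(100, 800, 2750))  # ~151 m
```

Treat this as an order-of-magnitude check only, not a prediction of the actual Beirut crater size.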


Land reclamation.


"Before" is dated 2019.

Is all the difference to be attributed to this blast? Am I missing something obvious here?


Sibling has posted a link to a higher-res version from a different source. I think the before image there is closer in time.

The structure on the far left of the parent's before picture was apparently removed before the explosion.


It's been a while since I worked in this space, but the underlying protocol (AIS) is incredibly lacking in terms of security, due to plaintext transmission and a lack of authentication. I'm not sure if these issues have been addressed, but below is some great research from 2014 on the matter [1][2].

[1] https://www.blackhat.com/docs/asia-14/materials/Balduzzi/Asi...

[2] https://www.youtube.com/watch?v=5rt9dzu3I7U


The payload length for a single AIS slot is 168 bits, so there is really no room for any kind of security headers.
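To make the plaintext point concrete, here's a minimal sketch of unpacking an AIS payload's 6-bit ASCII armoring (field offsets are those of position reports, message types 1-3; there is no decryption or signature check because none is specified):

```python
def sixbit(payload: str) -> str:
    """Expand the 6-bit ASCII characters of an !AIVDM payload into a bit string."""
    bits = []
    for ch in payload:
        v = ord(ch) - 48
        if v > 40:
            v -= 8
        bits.append(format(v, "06b"))
    return "".join(bits)

def parse_position_report_header(payload: str) -> dict:
    """First fields of AIS message types 1-3: all readable (and forgeable) as-is."""
    b = sixbit(payload)
    return {
        "msg_type": int(b[0:6], 2),   # 1, 2 or 3 = position report
        "repeat":   int(b[6:8], 2),
        "mmsi":     int(b[8:38], 2),  # vessel identity, transmitted in the clear
    }
```

Anyone with an SDR can receive these, and equally anyone can transmit well-formed forgeries, which is exactly the spoofing problem the 2014 research demonstrated.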


I definitely believe there is a need to transition college programs into an online format, or at least offer the option to college students. It is, however, tricky for various fields of study. While for computer science it's easy to offer an online curriculum, much of the value in the physical sciences comes from hands-on experience. The same goes for business students, where most of the value of in-person classes is derived from the network one builds. It's much harder to build those kinds of relationships without engaging with other students face to face.

That being said, from my computer science perspective, mandatory in-person lectures added little to no value in terms of actual knowledge gained. In fact, I would argue that not being able to participate online was a detriment to my college experience. To avoid taking on massive amounts of debt, I had to work full-time during my undergrad. I was lucky enough to land a full-time development gig my sophomore year, but it required being in the office 80% of the week. Between work, commuting, and sitting in a lecture hall, I had very few hours left in the day to actually work on my assignments, which forced me to make sacrifices to manage my time.

There was nothing communicated in our lectures that could not have been communicated in a pre-recorded video or via a class forum. The option to participate online will open up doors for many individuals who might not have been able to pursue higher education before, such as those who have to work or take care of their families.

The article mentions the OMSCS program at Georgia Tech [1], and I can't recommend it enough. I'm currently about halfway through the curriculum, and the format of the program alone addresses almost every issue I had with my undergrad. I can watch lectures and do projects at my own pace, which makes managing my time significantly easier. It also comes at 10% of the cost of my undergrad, and that's a price that's hard to beat.

[1] https://omscs.gatech.edu/


Congrats to the Hasura team!

Hasura is truly a marvelous piece of software when it comes to rapidly prototyping and getting things out into production. We discovered Hasura a couple of years back at my previous company when it was still in beta, and we were amazed by how easily we could automate most of our CRUD logic. I will definitely be using Hasura again when the time comes around.

On a more recent note, I currently have a client whose prototype was built with Hasura, but one of their requirements was to migrate to AWS and use as many of its PaaS offerings as possible. We ended up using AppSync [1] and it is fairly impressive. I highly recommend that anyone stuck in the AWS ecosystem check it out. AppSync integrates very easily with a lot of other AWS services (Cognito, S3) and lets you use DynamoDB/Aurora/RDS/Elasticsearch as data sources. On top of that, you can use Lambda to implement resolvers that need heavier business logic, making the service incredibly powerful.

[1] https://aws.amazon.com/appsync/


We used AppSync at my company and switched away from it to a more classical approach: handwritten CRUD behind a REST API, with RDS instead of DynamoDB (I'm not entirely sure how well RDS works with AppSync). AppSync was just not mature enough; it had bugs and severe limitations. Without someone on the team who really knows AppSync, I would recommend against it and towards the tech you know.


What was the biggest hurdle you faced that turned out to be the dealbreaker for your team?

I think for us, the biggest issue so far has been figuring out some of the more involved Velocity templates, as the documentation is fairly sparse and testing them can be a pain. Thankfully this project isn't too business-logic heavy, but due to time constraints we ended up writing Lambdas for the templates we couldn't figure out quickly.
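For anyone curious what that Lambda fallback looks like, here's a minimal sketch. The field name, argument shape, and the convention of forwarding `field`/`arguments` through the request mapping template are all my own invention for illustration, not AppSync's fixed contract:

```python
def handler(event, context):
    """Hypothetical AppSync Lambda resolver; assumes the request mapping
    template forwards the GraphQL field name and arguments in the event."""
    field = event["field"]
    args = event.get("arguments", {})
    if field == "discountedPrice":
        # Logic like this is awkward to express in a Velocity template
        # but trivial (and unit-testable) in a Lambda.
        rate = 0.10 if args.get("isMember") else 0.0
        return round(args["price"] * (1 - rate), 2)
    raise ValueError(f"unhandled field: {field}")
```

The nice side effect is that these resolvers can be tested locally with plain dict fixtures, unlike the templates.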

