
SameGoal Inc | Staff Software Engineer | Madison, WI | ONSITE (Post-Covid) | Full-Time | https://samegoal.com/

SameGoal's web application allows districts to collaboratively and compliantly document student participation in K-12 Special Programs such as Special Education, Gifted Education, English Language Learning and TGRG. We are a privately held, profitable company that operates without external capital to ensure stability and strategic direction.

We are looking for someone with 5+ years of software engineering experience (10+ preferred) who can: architect and develop significant infrastructure projects; debug complex systems to isolate and resolve problems quickly; mentor junior engineers to increase their productivity and foster a highly collaborative environment; develop key new features and functionality end-to-end, including modifications to our frontend and backend; initiate and lead tactical engineering projects to streamline operations across the company; and keep our users happy with a user-friendly, low-latency, highly stable application experience.

Technology Stack: Backend - Go, PostgreSQL, K8s | Frontend - Closure Tools, SPA | SCM - Git, Gerrit

Competitive salary and benefits package included. FLSA: Exempt. To comply with federal law, SameGoal participates in E-Verify. SameGoal is an Equal Opportunity Employer.

To apply, email cover letter and resume to jobs@samegoal.com


Does using `Content-Security-Policy: upgrade-insecure-requests` in addition to HSTS add value?


Yep: HSTS only applies to your own site, while upgrade-insecure-requests applies to every resource your site loads, even on third-party domains. Meanwhile, upgrade-insecure-requests does not replace HSTS, because it doesn't help secure navigation from offsite links or direct entry, which HSTS solves, especially with preloading. For browsers that don't support upgrade-insecure-requests, the fallback is to monitor CSP violation reports and actually fix the underlying mixed-content bugs.


Thanks for the clarification. I did not realize that `upgrade-insecure-requests` applies cross origin. If you do not load any insecure content is setting HSTS and `block-all-mixed-content` the best strategy?


As pointed out by Microsoft earlier today, MDN is one of the best resources on this sort of thing. Here they write:

> The upgrade-insecure-requests directive is evaluated before block-all-mixed-content and if the former is set, the latter is effectively a no-op. It is recommended to set one directive or the other – not both.

Source: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Co... and https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Co...

Yes, they are similar, and you do have to watch which one you set. But you can also achieve a similar effect at a more granular level using CSP, as MDN also indicates. These directives are roughly equivalent to saying `default-src https:` in the CSP rule.

In fact, the best option is individual CSP directives, which can get more granular than the `https:` scheme alone: you can then specify which trusted third-party domains (if any) are allowed to load resources on your pages, and set conditions (like nonces) for running script tags, data URIs, etc. After all, your secure third-party resources could still have their servers compromised, at which point they would send malicious assets over SSL to your unsuspecting users' browsers.

CSP, once you trust it enough to set it to block instead of just report (though you can run both modes at the same time), is one of the best defence-in-depth ways to protect your page from attack, right up there with the HttpOnly and Secure flags on cookies. https://developer.mozilla.org/en-US/docs/Web/HTTP/Cookies#Se...

If you're looking for checklists, have a look at https://wiki.mozilla.org/Security/Guidelines/Web_Security and https://blog.appcanary.com/2017/http-security-headers.html though remember no checklist is going to deliver bulletproof security on its own (you'll have to inspect your app and environment for flaws, implement monitoring tools, etc.), and blindly implementing security headers or features without knowing what they do can obviously break your app. (Again, monitoring your app can help.)


MS open sourced ChakraCore (the core of the Chakra JavaScript engine). EdgeHTML is still closed to the best of my knowledge.


Google's standard template language (soy / closure-templates) does minify whitespace by default [1]. However, it does not omit optional tags. I think this is why the style guide is written as it is.

Still, I agree that both types of optimization / minification should be done by the tooling layer.

[1] https://developers.google.com/closure/templates/docs/concept...


One example of "google only APIs" that I ran into recently is the Chrome extension browserAction.openPopup API. This API is whitelisted (in stable) for use by the Google Cast extension. Extensions authored outside of Google cannot use it. This creates an uneven playing field.

https://bugs.chromium.org/p/chromium/issues/detail?id=436489


That's nothing. One of the biggest injustices with Google Cast is that capturing system audio on Android is a whitelisted system-only permission and only Google Cast (and a couple system apps) are allowed access to this. Somehow screen video capture isn't a system-only permission.

Nobody other than Google (or your OEM) is allowed to build complete screen casting apps on Android. All third-party solutions require you to output your audio through your speaker and capture it with your microphone, which is much worse quality, single-channel, and audible out loud (I want to hear my device audio through the speakers of the device I'm casting to).

It makes no sense, especially when any device on your network can emulate being a Miracast receiver and get access to the same audio data that you aren't allowed to access on your own device. In fact, Windows 10 insider preview currently includes a Miracast emulator just for this purpose of Android screen sharing.


It seems it's not necessarily malicious. From https://bugs.chromium.org/p/chromium/issues/detail?id=399859:

"The popup is anchored to the extension icon, which might be in overflow or not even exist, in which case it is anchored to the Wrench menu. That kind of anchoring would make the message in the popup to appear to be from the Chrome browser (since it points to the chrome UI) and would present a vector for tricking users into thinking the message is from a trusted source.

Since this is not safe to allow all extensions to do we'd need a lot better reasoning than "I'd like to use this in my extension" before allowing widespread use of this API."


I agree that the API limitation is most likely not malicious and I did not intend to imply otherwise. Still, lack of malice does not change the fact that the Google Cast extension has a competitive advantage over other non-Google extensions (which can't use all of the same APIs).


This seems remarkably similar to Philips:

"While the Philips Hue system is based on open technologies we are not able to ensure all products from other brands are tested and fully interoperable with all of our software updates. For guaranteed compatibility you need to use Philips Hue or certified Friends of Hue products."

After all, it'd just be downright unthinkable that any non-Philips lightbulb should be compatible with our light sockets, that any brand of plug should fit into our wall sockets, or that non-Google-branded plug-ins should be able to use a web browser's APIs... I mean, goodness, next we'll be thinking that the term "plug-in API" suggests it's supposed to allow things that other people created to interoperate...


I agree, I would guess extensions also probably don't have access to chrome's password store. Trusted code can be handled differently.


So when are they going to get hit by lawsuits like the ones Microsoft got in the '90s?


They've already been hit with that in the EU. And I imagine the first thing after the current election cycle in the US ends will be newly minted (or newly defrocked, as the case may be) politicians attempting to take a cheap shot at Google -- especially since Popular Luddism is circling the political backwaters these days.


As already stated, they were already happening, and Alphabet is in large part an initiative to head that off.


This post links to the release notes. The post you mention links to the press release. They contain similar but different information.


Sure, but this is about the new release — I don't see the point of having two threads about the same thing. (The press release links to the release notes, too.)


Fair point. It is reasonable to demote this post.

Personally, I always read the release notes but am less interested in the press release. I think many others prefer the opposite, hence my posting both. You raise a valid point though.


Interesting. Can you offer any details on which production Google apps were presented?


For a bit more context, the technology that allows the client to communicate to the server which hostname it is attempting to connect to is called Server Name Indication.

https://en.wikipedia.org/wiki/Server_Name_Indication


Great feedback. We are too light on details. I'll add a more detailed description shortly.

The original idea with the video was to show the types of functionality that FormBig exposes (e.g., integrated logins, JS-based logic, rich text editing, etc.) rather than the details of how one achieves them. However, since we went that route, I think it's important to include more detail in the body of the page.

Thanks for the pointer to MS LightSwitch; I'm learning more about it now.


We don't have a final pricing model figured out yet, but we would like to support applications with a variety of needs. If we had a per-form based pricing option would that meet your needs?

Can you describe your use case a bit and the other platforms you have looked at thus far?

