You're correct that the script tag will not show. However, we train our testers to use special characters, including < and >, in their test data. It happens that the environment in which we spotted the vulnerability was our QA environment.
The 3D view definitely makes it a bit more visible, but as someone who has spent a considerable amount of time testing XSS filters, I don't find it all that useful. You generally know exactly where in the output your input will land, and looking at the raw output (not the constructed DOM tree) is a better way to identify XSS vulns anyway.
It's a cool observation nonetheless, and props for catching XSS vulns in your QA environment, not production ;)
Agreed. The 3D view is definitely cool, but I'm not really seeing it as much use for preventing XSS vulnerabilities. That said, I think the author of the post recognized this and was just sharing that it could occasionally help.
Agreed. It's good to be aware of it as a developer.
In our case we were lucky that we happened to view a page with a vulnerability, and that there happened to be data that would be interpreted as an HTML tag. But I believe in probability: nothing is certain, so we try to put in place practices that increase the probability of finding errors.
The 3D view is just another little wrench in our toolbelt that increases our odds.
Although this is correct, the real way to handle XSS vulnerabilities is to default-deny and encode all data, as Django does, requiring the developer to explicitly whitelist the parts that should be rendered raw.
Relying on injecting data and then spotting it in a fancy 3D view is not a very robust approach.
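To make the default-deny idea concrete, here's a minimal Python sketch of the pattern (this is an illustration using the stdlib, not Django's actual implementation; the `SafeString` and `mark_safe` names here are just chosen to echo Django's API): everything is HTML-encoded unless the developer explicitly marks it safe.

```python
import html

class SafeString(str):
    """Marker type: content the developer has explicitly vouched for."""
    pass

def mark_safe(value):
    # Whitelist step: the developer asserts this string is already safe HTML.
    return SafeString(value)

def render(value):
    # Default deny: anything not explicitly marked safe gets HTML-encoded.
    if isinstance(value, SafeString):
        return str(value)
    return html.escape(str(value))

# Untrusted input is neutralized by default...
print(render('<script>alert(1)</script>'))
# prints: &lt;script&gt;alert(1)&lt;/script&gt;

# ...and only opt-in content passes through verbatim.
print(render(mark_safe('<em>trusted markup</em>')))
# prints: <em>trusted markup</em>
```

The key property is that forgetting to do anything gives you the safe behavior; a developer has to take a deliberate, greppable action (`mark_safe`) to emit raw HTML.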
Using Perl, I love how Text::Xslate does exactly that by default, unless you "| mark_raw" the data that you know should instead be interpreted as raw HTML. That's the way to do it!
But not really, because most of the time, the "content" injected into your page is a script tag, which doesn't show up in the 3D view.