
[1.9.x] Javascript use in blog templates and custom skins

Started by on Apr 15, 2007 – Contents updated: Apr 15, 2007

Apr 15, 2007 22:59    

My b2evolution Version: 1.9.x

EdB wrote:

That's pretty nice code for selecting a random background image, including the location to put it at...I disagree on the "everyone should allow javascript" thing.

Thanks for the kind words about the random background code EdB. It's appreciated.

But since the topic of using javascript in b2evolution is a bit out of context in that thread (http://forums.b2evolution.net/viewtopic.php?p=53838#53838), I thought a new topic would be more appropriate for discussing your thoughts on javascript security, and why you think using a browser option to disallow javascript (outside of accessibility issues) is a reasonable client practice.

Apr 16, 2007 03:33

Sure, but I'm going to move this to "chat away" because it's not a skin-specific issue.

http://www.google.com/search?hl=en&q=malicious+javascript&btnG=Google+Search ought to about cover it. Therefore http://noscript.net/ is absolutely required on my Firefox.

Sometimes some things are best done with javascript. No problem: if I know I need javascript enabled I'll enable it for the site I'm on. Usually temporarily, sometimes forever. For example b2evolution.net is on my list of sites I allow javascript from. google-analytics.com and googlesyndication.com are not, so I see http://b2evolution.net/about/monetize-blog-money.php differently than someone who allows javascript from anyone anytime. Because Francois is hip to the idea of "graceful degradation" it shows me text via the noscript tag that tells me I'm missing something. Most sites that depend on javascript do not degrade gracefully, and to me that's really annoying. Especially when I fill out a form and click "submit" and nothing happens. Would it have killed the coder to put in a noscript tag telling me "this form depends on javascript"?

Another thing to think about is bandwidth and download times. Imagine if you want a random quote on your page. That was one of the topics in the original thread, so it's a good one to work with. Using javascript to pick one I would send your browser 50 quotes so that you could see ONE of them. You probably wouldn't even notice the time it took to send them on a fast connection, but what about your visitors on dialup? Believe it or not that's not an extinct form of connecting to the wild wild web. Isn't it much nicer to pick the quote on your server and send only the one that visitor gets to see?
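EdB's bandwidth point can be sketched in a few lines of javascript. With a client-side picker, the entire quote list travels to every visitor's browser just so one entry can be shown; a server-side picker would send only the chosen quote. A minimal sketch (the quote strings are made up for illustration):

```javascript
// Client-side approach: ALL quotes travel over the wire,
// even though the visitor only ever sees one of them.
var quotes = [
  "Quote number one.",
  "Quote number two.",
  "Quote number three."
  // ...imagine 47 more entries, downloaded on every page view
];

function randomQuote(list) {
  // pick a random index between 0 and list.length - 1
  return list[Math.floor(Math.random() * list.length)];
}

var shown = randomQuote(quotes);
```

Doing the same pick on the server means the page ships one string instead of fifty, which is exactly the dialup-friendly behavior described above.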

So it's safer for the visitor to not trust every site out there, and it's nicer to the visitor if you do your processing on your server. Whenever possible! If javascript is actually the best possible solution to a design issue then of course you should use it, but you should also degrade nicely and tell your visitor "hey you're missing something".

But hey that's just my opinion!

Apr 16, 2007 03:50

It may be covered in EdB's google search, but also consider that some/many corporate outfits have JS switched off to help ensure their network security.
They could be a potential client or customer.

After reading EdB's post I'm off to check my form validation scripts :)

Apr 16, 2007 09:15

The Javascriptless EdB wrote:

Especially when I fill out a form and click "submit" and nothing happens.

Or when you hover over links and all you get is "javascript:some_crap_function();" ... the good news is, the close-browser button does not require javascript :D

¥

Apr 16, 2007 13:26

EdB wrote:

Sometimes some things are best done with javascript. No problem: if I know I need javascript enabled I'll enable it for the site I'm on. .... Because Francois is hip to the idea of "graceful degradation" it shows me text via the noscript tag that tells me I'm missing something. Most sites that depend on javascript do not degrade gracefully, and to me that's really annoying. ..

Another thing to think about is bandwidth and download times. ...

So it's safer for the visitor to not trust every site out there, and it's nicer to the visitor if you do your processing on your server. Whenever possible! If javascript is actually the best possible solution to a design issue then of course you should use it, but you should also degrade nicely and tell your visitor "hey you're missing something".

But hey that's just my opinion!

Hello EdB,

You make excellent points about the use of javascript and the lack of attention that a lot of developers pay to degrading gracefully from that state. But I think that is more about programmer laziness than some deficit in the language used.

And surely, server-side generation can handle many tasks, such as table lookups, more efficiently than client-side javascript can. The downside of that approach, though, is that adding to the server's processing load is not always a good idea, especially when many tasks can be handled more efficiently by distributing some of the load (even asynchronously) to client computers.

But those subjects are quite different from your disagreement with my statement that "..not everyone permits javascript (although they perhaps should)."

I find it hard to believe that anyone would seriously challenge or discourage the use of client side javascript in developing highly interactive, state of the art web pages for today's market. I also believe that the security considerations designed into both today's browsers and the javascript language itself are quite effective in protecting the community from either real or imagined malware.

Taking this a step further, consider the universal acceptance of javascript by the most prominent websites on the globe. Do you think it is really a good idea to suggest that you should assume their audience would not allow javascript?

Let's not let the FUD (fear, uncertainty, doubt) factor influence our design of these systems simply because someone says that cookies and pop up windows present a security threat, or perhaps the world will crash because of the time formats of the year 2000.

Perhaps it's time to build our pages to satisfy the majority needs of our audience, but at the same time give consideration to the needs of the handicapped audience that requires special assistance.

Apr 16, 2007 14:03

needs of the handicapped audience that requires special assistance.

You may say FUD... I say the above statement is condescending.

Who's writing the malicious scripts?
Who's exploiting poorly written Javascripts?
Who's suffering the effects?

Handicapped people????

Sorry, but exploits are not aimed at handicapped people; they are aimed at anybody, including my Aunt, who likes to dabble in botany on the Internet but wouldn't know a thing about extensions or browsers or systems built by experts who get hacked month in and month out by kids and experts alike.

BTW, my Auntie is also not handicapped.

Apr 16, 2007 14:25

I never said there was a deficit in javascript! All I say is that javascript can be and is used maliciously. Blindly granting every website out there the right to execute scripts on your computer is, over time, asking for a problem. By the way, it's my computer. You, meaning any website developer out there, do not get to decide how much of my computer will be dedicated to your processing needs. So no: everyone should not permit javascript. In fact I'll go so far as to say any browser that allows javascript by default is insecure and has no respect for the actual computer user.

um... what do you consider "the most prominent websites on the globe"? If you are referring to google, I do not allow google to run javascript. The search engine works just fine without it. On the rare occasion that I need to go to gmail I will temporarily allow them to run their scripts, mark what I need to as spam, then disallow them. Youtube is quite popular, and requires javascript. They are kind enough to tell you that, thanks to their use of the noscript tag. If for some reason I had never been to youtube I would see that and say "okay, I'll temporarily allow this", then decide if the experience is worth permanently allowing. In my case I think it is, so I do. Further, I will say that a popular website assuming javascript will be allowed doesn't mean that I, or anyone else, have to allow javascript. If the website does not display properly without javascript (or images or flash, by the way), then it is a poorly coded website and is probably only popular because most people use IE and have no idea what they do not need to put up with.

It's been years since I've seen a popup ad. Oh and those stupid floating layer things are a thing of the past - on my computer. I also don't see third party ads. Nice eh? Unfortunately many webs have gaping holes in them where they want to throw junk at me. Mostly advertising, and mostly news sites. I use them for their content though so it's okay.

Gosh expecting me to allow whatever the web author wants would be like a store REQUIRING me to browse every aisle before I buy the product I went shopping for.

As to FUD: get real. The simple facts are that some people will include malicious scripts in web pages because they find a reward of some sort in doing so. There is no fear, uncertainty, or doubt involved. The chance that I will stumble upon a malicious script in my random surfing is orders of magnitude less than someone who globally allows javascript. In fact it simply won't happen because I do not allow sites to run scripts.

boblennon wrote:

Perhaps it's time to build our pages to satisfy the majority needs of our audience, but at the same time give consideration to needs of the the handicapped audience that require special Assistance.

That is completely unrelated to the subject here, and is also a damned good idea. Do you know how your site looks with javascript and images disabled? Do you know what it will look like on a portable device? How will a screen reader (for the blind) interpret your web? What happens to your web when someone increases the font size twice? Do you always supply alt text with images? How about titles on links? I wish I could answer YES to all of these, but I can't because I don't know. I try to find out by testing as much as possible in as many ways as possible - and make appropriate corrections to whatever file needs tweaking.

I just now threw out my skin and selected one of the skins from the skins site. I threw mine out because at work (800 pixel wide monitor using IE) it looked like crap. I therefore will have to go through many files to add alt and title attributes because I firmly believe they should be there. Oh, and I picked a skin that uses javascript and doesn't have a noscript tag! The script is for dynamic text replacement. You won't even know that you're missing something if you don't allow javascript from my site, but in truth you would be missing ... nothing. Without javascript enabled you see letters. With javascript enabled you see images instead of letters. Not a very big deal, but - to me - it absolutely requires a noscript tag telling the visitor that I use javascript to put my blog title in a fancy font. That way my visitors who use FF with NoScript will (a) see that a script is being blocked thanks to the NoScript extension and (b) know what the script they are blocking does. It is up to them at that point to decide if they want to see these images or not because, you see, it is their computer. They get to decide what is allowed and what is not.
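A noscript notice for that kind of dynamic text replacement can be tiny. A hypothetical sketch (the file name, ids, and wording are made up for illustration, not EdB's actual skin):

```html
<!-- plain-text title that every visitor gets, script or not -->
<h1 id="blogtitle">My Blog</h1>

<script type="text/javascript">
  // dynamic text replacement: swap the plain-text title for a fancy-font image
  document.getElementById("blogtitle").innerHTML =
    '<img src="title.png" alt="My Blog" />';
</script>

<noscript>
  <p>This site uses javascript only to show the blog title in a fancy font;
  without it you simply see plain text.</p>
</noscript>
```

Visitors blocking scripts lose nothing but the fancy font, and the noscript tag tells them exactly that.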

Apr 18, 2007 20:10

OK, I've had a quick skim, and yeah, it's a topic that intrigues me.

For the security thing, javascript would only be a problem if a malicious user were able to get javascript onto the site, e.g. via user-submitted content. A site developer wouldn't try to attack its own users. So for sites like myspace, having javascript disabled would be a fairly nice idea, but for sites like google, there's no point. Javascript is also handy for exploiting your own server-side code, as you can tamper with your forms and values. I've only used javascript against javascript once, in a case where a payment system was helped by javascript, so javascript could be used to screw with the payment system.
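The myspace-style risk described here is classic stored XSS: user-submitted content gets echoed back into the page without escaping, so a comment can carry a script that runs in every visitor's browser. The standard defense is to escape the HTML-special characters before output. A minimal sketch (the escapeHtml helper is my own invented name, not a b2evolution function):

```javascript
// Escape the five HTML-special characters so user-submitted
// content renders as inert text instead of executing as markup.
function escapeHtml(str) {
  return String(str)
    .replace(/&/g, "&amp;")   // must come first, or it re-escapes the others
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

// a hypothetical malicious comment a visitor might submit
var comment = '<script>steal(document.cookie)</script>';
var safe = escapeHtml(comment);
// "safe" now contains no raw < or >, so the browser shows it as text
```

Escaping at output time is why well-behaved blog engines can let commenters type anything while keeping other visitors safe.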

Anyway, back to the original topic.

Graceful degradation in javascript is a hard concept to get hold of; I've done a lot of posts about it here. The method I have adopted is that a javascript-enhanced page works fine without javascript and is enhanced when javascript is available. The page is built unenhanced first; then the javascript loads in new CSS styles and new scripts that bring out the hidden functionality. I used to make php scripts output two different pages, but that became unreliable and was really just a horrible idea.

Edit: here is a snippet of what I consider good degradation:

<head>
  <style type="text/css">.has_javascript { display:none; }</style>
  <script src="js.js" type="text/javascript"></script>
</head>
<body>
  <p class="no_javascript_notice">enable javascript to bling this site out</p>
  <p class="has_javascript" onclick="dosomethingleet();">this is some bling for javascript users</p>
</body>

And in js.js:

// show the enhanced bits and hide the "enable javascript" notice
document.write("<style type='text/css'>.has_javascript { display:block; } .no_javascript_notice { display:none; }</style>");
// some code to init javascript, and finish enhancing page

Apr 19, 2007 18:20

The security problem, as you rightly note, is when the developer IS the malicious person. Chances are very good that it will not be a visitor to a website. Rather, it will be a person visiting a site that has either been hacked or is intentionally malicious. Thus I prefer to surf safely and only allow scripts from domains I deem trustworthy.

It is something all web developers and surfers should be aware of eh? Developers should never assume that what they see is what their potential visitors will see. Our browsing tools and habits are not always the same right? Surfers on the other hand should be wary of what's out there. The vast majority of people (sites?) you meet are not out to do you harm, but for the one in one million that are ... "better safe than sorry" seems like a good philosophy.

Apr 19, 2007 19:00

EdB wrote:

The security problem, as you rightly note, is when the developer IS the malicious person.

It's not often I call you wrong, but it doesn't have to be the developer who's the malicious person.

1) go visit whoo's blog and visit the "security exploit" ( with the demo blog she gives ) [url=http://www.village-idiot.org/archives/2007/04/17/take-the-test/]http://www.village-idiot.org/archives/2007/04/17/take-the-test/[/url] .... you'll need to enable js for the target domain to see the results
2) read the xss, csrf threads on these forums [url=http://sla.ckers.org/forum/list.php?3]http://sla.ckers.org/forum/list.php?3[/url]

I totally agree with you about surfing safe though, I have js/flash/cookies/gimmicks disabled by default .... so guess what? .... now they can get you through css as well ;) [url=http://ha.ckers.org/blog/20070302/portscanning-without-javascript-part-2-2/]http://ha.ckers.org/blog/20070302/portscanning-without-javascript-part-2-2/[/url]

¥

Apr 19, 2007 20:12

Correct on all points. The thrust of my comment was geared more toward balupton's statement, which implied that site owners could be trusted and commenters were not able to do malicious things. In b2evolution that holds, as long as you don't turn off the things that protect you.

(Oh man I can't believe I let a simple path thing trash my visiglyphs!)



b2evolution CMS – This forum is powered by b2evolution CMS, a complete engine for your website.