Firefox 18 Beta Out With IonMonkey JavaScript Engine

Unknown Lamer posted about 2 years ago | from the up-to-11 dept.

An anonymous reader writes with a quick bite from The Next Web about the latest Firefox beta, this time featuring some under-the-hood improvements: "Mozilla on Monday announced the release of Firefox 18 beta for Windows, Mac, and Linux. You can download it now from Mozilla.org/Firefox/Beta. The biggest addition in this update is significant JavaScript improvements, courtesy of Mozilla's new JavaScript JIT compiler called IonMonkey. The company promises the performance bump should be noticeable whenever Firefox is displaying Web apps, games, and other JavaScript-heavy pages."

So it'll actually be respectable on Facebook? (0)

VanessaE (970834) | about 2 years ago | (#42102421)

Heh, subject says it all.

Also, first post?

Re:So it'll actually be respectable on Facebook? (4, Funny)

zrbyte (1666979) | about 2 years ago | (#42103153)

aaaand I'll be able to boot Linux [bellard.org] faster!

Re:So it'll actually be respectable on Facebook? (1, Interesting)

hairyfeet (841228) | about 2 years ago | (#42103351)

Its horrible performance on FB is one of the reasons my customers are using a Chromium variant right now. Does anybody know how it does as far as CPU loading? I have to support a lot of low-power systems, and since around FF V6 it's been completely unusable, especially for watching SD video; even opening new tabs can cause FF to slam the CPU to 100%.

Anyway, I don't mess with beta software anymore (enough bugs in the releases, thanks ever so), but if anybody has a low-power AMD Bobcat or Intel Atom I would like to know how this compares to Chrome.

Re:So it'll actually be respectable on Facebook? (0)

Anonymous Coward | about 2 years ago | (#42103477)

blah blah blah my customers blah blah blah buy AMD

Re:So it'll actually be respectable on Facebook? (0)

Anonymous Coward | about 2 years ago | (#42103957)

Wrong person, dipshit.

Re:So it'll actually be respectable on Facebook? (-1)

Anonymous Coward | about 2 years ago | (#42104291)

Look at his (read:your) post history, faggot. All you...er he ever does is regurgitate shit about your...er his "customers" little old Mrs. Smith and harp on about how great AMD and Microsoft are.

Re:So it'll actually be respectable on Facebook? (0)

rishistar (662278) | about 2 years ago | (#42103609)

There will be issues on 64-bit Windows as it seems 64-bit Firefox isn't being supported anymore...
http://arstechnica.com/information-technology/2012/11/64-bit-firefox-for-windows-should-be-prioritized-not-suspended [arstechnica.com]

Re:So it'll actually be respectable on Facebook? (1)

dririan (1131339) | about 2 years ago | (#42103639)

That was about the 64-bit build, running exclusively in 64-bit mode. Running the 32-bit version of Firefox on 64-bit Windows is still fully supported. The big problem with 64-bit Firefox on Windows was that, unlike Linux, plug-in developers (read: Adobe) didn't port their plug-ins to 64-bit, and only released them in 32-bit variants.

Re:So it'll actually be respectable on Facebook? (3, Informative)

serviscope_minor (664417) | about 2 years ago | (#42103789)

I have to support a lot of low-power systems, and since around FF V6 it's been completely unusable, especially for watching SD video; even opening new tabs can cause FF to slam the CPU to 100%.

Well, yes. I have an eee 900 (which is single core). I found firefox and chromium both to be pretty pathetic at playing video. Even in HTML 5. Actually, I'm not sure how they manage to make such a meal of it. It's really terrible. They're basically no better than flash. Perhaps even worse.

I used to use flashvideoreplacer, but that doesn't work any more. I now use a greasemonkey script which basically replaces youtube videos (perhaps others?) with MPlayer, and it can decode 720p just fine in realtime.

Firefox can slam the CPU to 100%. Chromium is multithreaded, so it can slam the CPU to 100% with a load average of 57, making the machine grind completely to a halt.

Chromium feels a bit snappier in that the UI elements give visual feedback quicker, but it doesn't actually seem to get the pages up faster. I switched back to firefox after a while.

Screw YOU! (-1)

Anonymous Coward | about 2 years ago | (#42102425)

IE 8 works perfectly fine and this is just a ploy by Microsoft to steal our hard-earned money from a platform that has worked fine for years! It costs businesses money to upgrade and NO, I won't have any change. What I have works fine and I WON'T change. I am in my late 30s to early 40s and I know better, and this potent knowledge tells me the PHBs are right and IE 8 is best. Why spend money on a cost center ... ... oh, I thought we were talking about Windows 7. Never mind. Oh yeah, Firefox ROCKS@@@

So far (2)

ArchieBunker (132337) | about 2 years ago | (#42102433)

None of these improvements feel any faster. Pages still load as quickly as they did a decade ago (provided your connection was fast). Why can't they make anything render faster?

Re:So far (3, Interesting)

Billly Gates (198444) | about 2 years ago | (#42102449)

None of these improvements feel any faster. Pages still load as quickly as they did a decade ago (provided your connection was fast). Why can't they make anything render faster?

Have you used Firefox 3.6 recently? It sucks very badly, which is why myself and Hairyfeet have been promoting Chrome for 2 years. Run it in a VM and even IE 9 is a far better browser. 10 years ago IE 6 broke all records in javascript performance. Run it today and slashdot or its default MSN homepage will crash within 20 seconds, as the javascript interpreter can only run in 20 megs of RAM.

The old Macs in the breakroom at my employer, running Safari 1.x from 2006, are simply not even usable, as yahoo.com takes 5 minutes to load.

Re:So far (4, Insightful)

epyT-R (613989) | about 2 years ago | (#42102473)

javascript was never meant to do the things it's being used for now; that's why sites are so damned slow.

Re:So far (1)

Billly Gates (198444) | about 2 years ago | (#42102485)

IE 6, as much as we hate it, defined Web 2.0 by inventing AJAX. Right now the browser is a platform whether we like it or not, which is why old browsers crash on news websites. It is the frameworks that get loaded, taking gigs of RAM, as the browser is the OS.

Firefox 18 and Chrome can make it run decently. We need another language in the browser, I agree, but with smartphones and applets running we need a platform, as they simply use the HTML 5 browser for their UI and do AJAX for the logic.

Re:So far (1)

epyT-R (613989) | about 2 years ago | (#42102573)

Yeah I know, I just don't like it. 95% of the security problems we have start with the scriptable browser. If you want an interactive application that talks to remote data sources, make one and distribute it. It's not like the current stack's attempts at virtualization/abstraction have proven any more secure. It just makes remote-access applications slower and makes them take 10x the resources they should.

Re:So far (3, Insightful)

Darinbob (1142669) | about 2 years ago | (#42102797)

Or we could just turn back the clock to the good old days, when web sites were about presenting information simply, with a simple markup language, instead of trying to be full applications.

Re:So far (4, Insightful)

NightHwk1 (172799) | about 2 years ago | (#42102551)

What sites are so damned slow? It's not the Javascript in most cases, it's the asset loading. The tubes are still the bottleneck on the web.

If anything, Javascript is speeding things up. AJAX content updates without a full page refresh are commonplace now, and there are more and more sites that are mostly client-side, using frameworks like Backbone.
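
The pattern behind those AJAX updates is simple enough to sketch. Something like this hand-rolled XHR (a rough illustration; the /comments URL and the comment-list id are invented) swaps in a fragment without reloading the page:

var xhr = new XMLHttpRequest();
xhr.open('GET', '/comments?page=2', true); // async; only the fragment travels over the wire
xhr.onload = function () {
  if (xhr.status === 200) {
    // Replace just the comment list, leaving the rest of the page alone
    document.getElementById('comment-list').innerHTML = xhr.responseText;
  }
};
xhr.send();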

Re:So far (4, Informative)

tlhIngan (30335) | about 2 years ago | (#42102739)

What sites are so damned slow? It's not the Javascript in most cases, it's the asset loading. The tubes are still the bottleneck on the web.

If anything, Javascript is speeding things up. AJAX content updates without a full page refresh are commonplace now, and there are more and more sites that are mostly client-side, using frameworks like Backbone.

Well, let's see. Go to a site like Engadget and make sure your javascript allow settings are set to show the page comments. Now open 24 hours' worth of news postings in tabs. About halfway through, your browser will start to lag with the CPU pegged. It's not loading content off the Internet if your CPU is pegged; it's all the script instances running.

Or you can replicate it with a busy thread, like one of their "post a comment to enter a draw" posts, where it gets around 200 comments a minute. You'll find the browser noticeably slows down as the javascript is constantly running.

Repeat the same with a site like Gizmodo (probably an even worse offender). I can't open more than 10 tabs at a time before the browser locks up for a few minutes at 100% CPU.

Latest Firefox and Chrome. Websites just gobble up javascript processing. Those sites are unusable on weaker javascript engines.

Re:So far (1)

Billly Gates (198444) | about 2 years ago | (#42102939)

Shit, if you have IE 6, just open it in a VM and the default MSN homepage (aka nbcnews.com) will freeze within 20 seconds, guaranteed! It's the ad networks' javascript, with its "Add this to your facebook when you LIKE IT!", that is designed for more modern browsers.

IE 6 takes about a full minute to render it on my FIOS connection on a modern system, before popping up an error dialog asking if I want to report the crash to Microsoft.

I'll give credit: Firefox 3.x and Chrome have upped their game. Even the retarded stepchild IE 8 can at least load it, and IE 9 is much closer to a modern browser with this same script. Google maps is a popular stress test; I won't even bother loading it under IE 6, hahaha. Under IE 8 it slowly, and I mean slowly, works if you have patience, compared to IE 9, Chrome, and Firefox. IE 10 should be out soon, which is somewhat modern. But just like the CPU wars of our past, as soon as a great CPU and more RAM became available, a race to use them ensued, making your once-fast 386 obsolete FAST. The same is true of browsers.

Re:So far (4, Informative)

wmac1 (2478314) | about 2 years ago | (#42102975)

Engadget has heavily linked in components from other websites in the form of inline frames, advertisements, images, tracking, etc. They have also made heavy use of Javascript, transparent layers, and so on. One would think they purposefully applied "worst practices".

Having either DNT+ or AdBlock (with privacy filters) will stop the commenting system altogether.

Re:So far (1)

Jah-Wren Ryel (80510) | about 2 years ago | (#42104045)

Having either DNT+ or AdBlock (with privacy filters) will stop the commenting system altogether.

Which is soooo ironic. If you are blocking their ads, the only way you can help them is to contribute to the community so that more people without ad blockers will spend time loading pages with ads. Plus, it is reasonable to assume that people blocking ads are smarter than your average dog on the internet, so their comments might be of a higher calibre than the hoi polloi's.

Re:So far (0)

Anonymous Coward | about 2 years ago | (#42103251)

Also, for all the talk of optimization, libraries (e.g., jQuery) are significantly slower than native JavaScript. Check out jsPerf.com and see for yourself.
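
A jsPerf-style micro-benchmark makes the gap obvious. Rough sketch (assumes jQuery is loaded and an element with id "content" exists on the page):

// Native lookup: one document query per call
var t0 = Date.now();
for (var i = 0; i < 100000; i++) document.getElementById('content');
var nativeMs = Date.now() - t0;

// jQuery: selector parsing plus a wrapper object allocated on every call
t0 = Date.now();
for (var j = 0; j < 100000; j++) $('#content');
var jqueryMs = Date.now() - t0;

console.log('native: ' + nativeMs + 'ms, jQuery: ' + jqueryMs + 'ms');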

Re:So far (1)

haruchai (17472) | about 2 years ago | (#42103717)

What are your system specs? I've been hearing complaints from various Slashdotters for a while but can't replicate their issues. I usually run the PortableApps FF package with TreeStyleTabs and a few other add-ons, but it's only with this latest FF 17 that it's been at all crashy - I've had a few crashes in the last week when I get above 10 windows & 70 tabs.

I've routinely exceeded 15-20 windows / 80-175 tabs for a couple of years, with as many as 5 simultaneous HTTP downloads, 3-5 YouTube streams, Facebook, Slashdot, etc. all at the same time.

Re:So far (4, Interesting)

wvmarle (1070040) | about 2 years ago | (#42103053)

Slashdot is a prime example of a site heavily using javascript.

Ubuntu 10.04 LTS stuck with Firefox 3.6 for a long time. When loading a /. page, particularly one with many comments, it often gave me the "script is taking too long to complete" warning. It would eventually finish, but it took a long time. When Ubuntu finally replaced the browser with a newer Firefox, that problem was solved. It now renders reasonably fast.

And considering I have ads disabled, it is really /. itself that's so demanding.

Re:So far (0)

Anonymous Coward | about 2 years ago | (#42104009)

When Ubuntu finally replaced the browser with a newer Firefox, that problem was solved.

The problem was solved long before (for those with more than one braincell) when they added the current-version Firefox PPA to 10.04.

Re:So far (1)

yahwotqa (817672) | about 2 years ago | (#42104067)

Nope, no javascript on my slashdot. :) Using the older interface also has the benefit of preventing accidental moderation (you still have to click the Moderate button).

Re:So far (0)

Anonymous Coward | about 2 years ago | (#42103099)

It's the Javascript. I routinely use a three-year-old browser. Many sites freeze it for several seconds after each page load, and sometimes after every interaction. Many sites don't work right at all. IMHO there must be some "we don't care how long this takes on older browsers, as long as the result is the same as on current browsers" kind of workaround in a major Javascript library that triggers the stall. Old browsers that were the heralds of standards compliance just a few years ago are basically unusable on the modern web.

Re:So far (1)

serviscope_minor (664417) | about 2 years ago | (#42103407)

If anything, Javascript is speeding things up. AJAX content updates without a full page refresh are commonplace now,

Yeah, that was always the idea. And sure, the AJAX reload in full gmail is faster than a full refresh. That's not surprising, because the code required to run the whole program is quite large. However, on really slow connections the old fallback HTML interface works faster, and gmail automatically switches to it.

Of course it means you can have massive pages without insane repeated load times, but small HTML only pages often run faster even if the UI presented is not as good.

Another good example is theregister. Very simple mostly static pages (works fine with JS disabled) and blindingly fast load times. It makes it very pleasant to read news there.

Re:So far (0)

Anonymous Coward | about 2 years ago | (#42102943)

Insightful my ass. Javascript is a general-purpose language now, and it has been redesigned, heavily optimized and turned upside down to allow for the things it's being used for now.

This is not the Javascript of your grandfathers, and it's not responsible for huge page sizes or for your ISP's latency and bandwidth.

Re:So far (1)

Anonymous Coward | about 2 years ago | (#42103065)

Disabling javascript, except in cases where it's really needed (or useful), can make quite a difference. Most sites work fine and load quicker as a result.

Re:So far (1)

Anonymous Coward | about 2 years ago | (#42102589)

Well duh. We just upgraded to FF 17. FF 3.6 must have been 12 years ago. Of course it's slow today (or even two years ago when FF 15 must have been the norm).

Re:So far (0)

PNutts (199112) | about 2 years ago | (#42102847)

Well duh. We just upgraded to FF 17. FF 3.6 must have been 12 years ago. Of course it's slow today (or even two years ago when FF 15 must have been the norm).

IIRC FF 3.6 was about 9 months ago. Ba-dum.

Re:So far (2)

Billly Gates (198444) | about 2 years ago | (#42102945)

FF 4 came out in March 2011. I remember the day vividly. That is 18 months ago. 12 years ago we were waiting for the oh-so-awesome IE 6, with the correct box model, ooh and aah.

That might not seem like a long time to many here, but the way browser wars 2.0 are heating up, FF 4 is fast becoming the next IE 6. A world of difference in just a year and a half if you benchmark them both.

Re:So far (0)

Anonymous Coward | about 2 years ago | (#42102625)

which is why myself and Hairyfeet

I know how you get hairy palms. But how the hell do you get hairy feet? (shudders)

Re:So far (1)

afaik_ianal (918433) | about 2 years ago | (#42102677)

Licking his palms.

Re:So far (0)

Anonymous Coward | about 2 years ago | (#42103395)

I'm using Firefox 3.5 right now like I always do; slashdot works fine.

Re:So far (1, Insightful)

Cinder6 (894572) | about 2 years ago | (#42102457)

I find this an odd comment. I definitely notice pages rendering faster, and I can see this effect simply by changing browsers. Some are genuinely faster than others all around, while others have different rules for when they begin to display content.

With that said, I'm getting kind of tired of all the Firefox posts. Why do we get a post for seemingly every Firefox release, including betas, and no mention at all of Chrome updates? (Note: I'm not advocating for more "stories" about Chrome.) Maybe everyone's still in the mindset of Firefox getting an update only twice a year (or maybe this site's Firefox usage justifies it). Yeah yeah, I know, "don't read it if you don't like it" and all, but it's still a bit perplexing.

Re:So far (1)

Anonymous Coward | about 2 years ago | (#42102717)

Firefox is open about their updates, and this one is significant - a whole new JIT compiler.

This is an area that has sorely needed an update; people were complaining about the speed.
So notifying them that a significant speed bump arrives in the new version is HELPFUL INFORMATION.

Google manages their updates differently and they do usually get articles when significant milestones are reached.
If they re-did their JIT compiling and it resulted in a speed boost, you don't think you'd hear about it?
Maybe the problem is really just a lack of attention paid?

Re:So far (-1)

Anonymous Coward | about 2 years ago | (#42102463)

why are you such a faggot?

Re:So far (0)

epyT-R (613989) | about 2 years ago | (#42102475)

The same reason you are, I guess... genetic luck of the draw.

Re:So far (0)

Andy Prough (2730467) | about 2 years ago | (#42102553)

WTF does FF18 have to do with Welsh pork liver patties? Idiot.

Re:So far (-1)

Anonymous Coward | about 2 years ago | (#42102861)

That's not a very good riddle. Try this:
Why did the firefox have faggots and peas for dinner?
.
.
.
Because it was Friday.

Re:So far (1)

Jonah Hex (651948) | about 2 years ago | (#42102483)

I'm still going through loading my tabs, and initial page load seems slower, but of course it could just be my connection acting up. Subjective experience means much to the individual user, but I can't tell yet that anything is really faster. *shrug* - HEX

Re:So far (3, Insightful)

DNS-and-BIND (461968) | about 2 years ago | (#42102893)

Because all the speed improvements were used by developers not to give the user a better experience, but to develop ever-more-complex pages. "Look, this new browser is 50% faster. Now, we can make a super-complex web page and still get the old speed!" Repeat for every speed increase.

What's really going to happen: (1)

Let's All Be Chinese (2654985) | about 2 years ago | (#42103335)

The webmonkeys get hold of it. They do everything with it. They're ecstatic! Finally, something that runs their javascript nice and fast!

So they throw more JS into their webpages. Drop in a few more libraries, for their convenience. Of course, they test the stuff on a dev server that's at least as fast as the production server but sees only a small fraction of the load, and they have gigabit from desktop to server.

Thus their websites become that much more crappy for everyone else: for everyone who doesn't have the latest accelerator, or a nice fast connection to an overspecced and mostly idle server.

It's happened before, and it'll happen again. Feh, if your desktop is old enough (single core, less than 2 GHz these days), then between the crashes due to low memory you can actually notice when, say, jquery gets an update: everything that uses it gets slower.

This is the state of websites, and as things stand, faster browsers mean slower websites for non-webmonkeys.

Re:So far (1)

LordLimecat (1103839) | about 2 years ago | (#42104203)

Because by and large web developers hate you and see it as their duty to load your CPU as much as they can.

You want someone to blame, blame the folks who see it as their duty to match JS engine improvements with more crap on the webpage.

With this frequency, I can't keep up! (-1, Flamebait)

bogaboga (793279) | about 2 years ago | (#42102497)

With Mozilla's frequency of spitting out ever-newer versions, I just can't keep up!

One wonders why the folks at Mozilla won't spruce up what is currently available, for instance by adding enterprise management features, instead of upping versions as if it's their [only] calling.

Why? If I may ask?

Re:With this frequency, I can't keep up! (2)

Tarmas (954439) | about 2 years ago | (#42102525)

With Mozilla's frequency of spitting out new and newer versions, I just can't keep up!

No problem, just turn automatic updates on.

Re:With this frequency, I can't keep up! (0)

Anonymous Coward | about 2 years ago | (#42103217)

No problem, just turn automatic updates on.

Only works on systems where the user runs as admin/root. You don't do that, do you?

Re:With this frequency, I can't keep up! (0)

Anonymous Coward | about 2 years ago | (#42102779)

Maybe you're on the wrong release channel.

Oh, and because that's what the devs enjoy more.

Still slower than v8 (1)

Anonymous Coward | about 2 years ago | (#42102501)

I found it remarkable that the benchmarks only compared to earlier versions of the Firefox JavaScript implementation. A comparison with JavaScriptCore and v8 can be found at http://arewefastyet.com [arewefastyet.com]

Re:Still slower than v8 (2)

Fnkmaster (89084) | about 2 years ago | (#42102833)

But only marginally so. In most benchmarks, it looks to be roughly on par with v8 now. In some, what, 20% slower? I think that's pretty respectable - compare with the situation a year or two ago when Firefox's Javascript engine was several times slower than v8.

Re:Still slower than v8 (1)

Anonymous Coward | about 2 years ago | (#42102935)

But only marginally so. In most benchmarks, it looks to be roughly on par with v8 now. In some, what, 20% slower? I think that's pretty respectable - compare with the situation a year or two ago when Firefox's Javascript engine was several times slower than v8.

So the browser written by Google is 20% better on the benchmark written by Google, but there's much less difference on other benchmarks?
Could that difference be related to the fact that the winning browser and the benchmark are written by the same company?
And note that this doesn't need to be intentional manipulation: It may just be that Google's benchmark is the primary benchmark Google's JavaScript engine optimizations are tested against.

flash still crashes firefox regularly on win 2012 (0)

Anonymous Coward | about 2 years ago | (#42102523)

still causing slowdowns and frustration, regardless of the speed of the javascript engine.

Déjà Vu (0)

Anonymous Coward | about 2 years ago | (#42102543)

Seems like every other browser release advertises massive javascript performance boosts via some new engine.

IonMonkey, JagerMonkey, TraceMonkey, SpiderMonkey (5, Interesting)

file_reaper (1290016) | about 2 years ago | (#42102557)

I haven't kept track of the JITs that have been in Firefox. I recall the days when TraceMonkey and JagerMonkey were added to boost performance. Could somebody recap, or explain why Firefox is abandoning or redoing the older ones? I'm truly curious as to what they learned, what worked and what didn't. Are they finding new usage patterns that warrant a new JIT design? Thanks.

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (1)

MSG (12810) | about 2 years ago | (#42102913)

They aren't being replaced. Each of these codenames is an additional optimization layer. The performance enhancements are cumulative.

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (3, Informative)

BZ (40346) | about 2 years ago | (#42103087)

That's not quite true.

TraceMonkey has in fact been removed, and JaegerMonkey may be too once the new baseline JIT being worked on now is done.

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (5, Informative)

Turnerj (2478588) | about 2 years ago | (#42102931)

Wikipedia [wikipedia.org] goes into a bit of detail about it but in basic summary...

TraceMonkey was the first JIT compiler for SpiderMonkey released in Firefox 3.5.

JagerMonkey is a different design from TraceMonkey that outperforms it in certain circumstances (some differences between TraceMonkey and JagerMonkey [mozilla.org]).

IonMonkey is another attempt at refining the ideas behind JagerMonkey, allowing even greater optimisations under particular circumstances.

However TraceMonkey, JagerMonkey and IonMonkey are part of SpiderMonkey as JIT compilers, not a replacement of SpiderMonkey itself.

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (0)

Anonymous Coward | about 2 years ago | (#42102999)

Mozilla had the first JavaScript engine (SpiderMonkey) and the first JavaScript JIT (TraceMonkey), so it's not surprising that they've had more changes. Their development process is also much more transparent than that of other vendors, so their codenames get more visibility.

Bear in mind that Webkit's JavaScriptCore has had SquirrelFish and SquirrelFish Extreme JITs, Opera has had Futhark and Carakan, and even relative newcomer V8 has had a new Crankshaft JIT added. Mozilla is by no means the odd one out, optimising JavaScript is still a relatively young field and people are still working out the best way to do it.

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (1)

Anonymous Coward | about 2 years ago | (#42103001)

The big difference with IonMonkey is that it adds an IR (intermediate representation) stage. That allows for much better and more modular optimizations, at the cost of making compilation take significantly longer. The idea is that the JägerMonkey JIT has a faster start-up time and will be used for code that doesn't run as long, while IonMonkey will be used to more heavily optimize very long-running code.
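
You can model the tiering policy in a few lines of JavaScript, fittingly enough. This is only a toy (the threshold and the two "compiler" stubs are made up; real engines emit machine code and tune the numbers carefully):

// Stand-ins for the two tiers; in a real engine these produce machine code.
function baselineCompile(fn) { return fn; }   // quick to produce, slower to run
function optimizingCompile(fn) { return fn; } // slow to produce (builds an IR), fast to run

var HOT_THRESHOLD = 1000; // hypothetical; the real heuristic is more subtle

function tiered(fn) {
  var calls = 0;
  var code = baselineCompile(fn); // everything starts in the cheap tier
  return function () {
    if (++calls === HOT_THRESHOLD) {
      code = optimizingCompile(fn); // only hot functions pay the expensive pass
    }
    return code.apply(this, arguments);
  };
}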

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (0)

Anonymous Coward | about 2 years ago | (#42103047)

Because SpiderMonkey had too much JagerMonkey and ran out of IonMonkey so couldn't finish TraceMonkey?

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (5, Informative)

BZ (40346) | about 2 years ago | (#42103135)

A short summary:

1) TraceMonkey turned out to have very uneven performance. This was partly because it type-specialized very aggressively, and partly because it didn't deal well with very branchy code due to trace-tree explosion. As a result, when it was good it was really good (for back then), but when it hit a case it didn't handle well it was awful. JaegerMonkey was added as a way to address these shortcomings by having a baseline compiler that handled most cases, reserving tracing for very hot type-specialized codepaths.

2) As work on JaegerMonkey progressed and as Brian Hackett's type inference system was being put in place, it turned out that JaegerMonkey + type inference could give performance similar to TraceMonkey, with somewhat less complexity than supporting both compilers on top of type inference. So when TI was enabled, TraceMonkey was switched off, and later removed from the tree. But keep in mind that JaegerMonkey was designed to be a baseline JIT: run fast, compile everything, no fancy optimizations.

3) IonMonkey exists to handle the cases TraceMonkey used to do well. It has a much slower compilation pass than JaegerMonkey, because it does more involved optimizations. So most code gets compiled with JaegerMonkey, and then particularly hot code is compiled with IonMonkey.

This is a common design for JIT systems, actually: a faster JIT that produces slower code and a slower JIT that produces faster code for the cases where it matters.

https://blog.mozilla.org/dmandelin/2011/04/22/mozilla-javascript-2011/ [mozilla.org] has a bit of discussion about some of this.
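
To make point 1 concrete, this is the sort of branchy loop that trace trees handled badly: each data-dependent path through the conditionals wants its own trace, so the number of traces grows combinatorially (a toy example I made up, not from Mozilla's test suite):

var total = 0;
for (var i = 0; i < 1000000; i++) {
  var x = i % 7;          // data-dependent, so the branch outcome flips constantly
  if (x < 3) total += x;
  else if (x < 5) total -= x;
  else total += 1;
  if (total < 0) total = -total; // a second branch multiplies the path count
}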

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (0)

dotancohen (1015143) | about 2 years ago | (#42103857)

Thank you for the comparison. Why can't web developers compile the javascript and provide that? I do understand that each runtime (browser) is unique, but why not have something along the lines of:

<script type="text/javascript" name="fooBar" compiled-for="firefox" src="firefox.js"></script>
<script type="text/javascript" name="fooBar" compiled-for="chrome" src="chrome.js"></script>
<script type="text/javascript" name="fooBar">
        fooBar();
</script>

Thus the appropriate compiled code is presented to each runtime, and if there is no compiled code available for any particular runtime then the uncompiled code can be used. This is similar to how software is currently made available: binaries for the common platforms and source for the rest.

Of course I realize that MSN.com would make compiled code available only for IE, thus ostensibly 'killing' Firefox and Chrome performance. In fact, Firefox and Chrome performance would remain as it is; simply, IE performance would be improved.

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (1)

dotancohen (1015143) | about 2 years ago | (#42103863)

In retrospect, "text/javascript" for the first two items should be "bin/javascript".

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (0)

Anonymous Coward | about 2 years ago | (#42103917)

Web developers asking the browser to run binary code straight from the server surely sounds like a refreshing idea. I wonder if anyone has thought about it before...

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (1)

joaosantos (1519241) | about 2 years ago | (#42103931)

I would prefer a more efficient lower-level intermediate language, common to every browser, that could be targeted by more languages.

Re:IonMonkey, JagerMonkey, TraceMonkey, SpiderMonk (1)

shipofgold (911683) | about 2 years ago | (#42104065)

I can't wait to see what kind of malware this type of scheme would produce. I have no proof, but it feels like running compiled bytecode would be easier to exploit than text-based javascript.

JS Speed is the deciding factor in modern webpages (5, Insightful)

detain (687995) | about 2 years ago | (#42102607)

It's good to see the focus of this release being an attempt to increase javascript speed by leaps and bounds. Modern webpages often use JS that goes way beyond anything people did 10 years ago (jQuery, for example), and the complexity of what people do with javascript noticeably slows down most webpages.

Re:JS Speed is the deciding factor in modern webpa (3, Interesting)

VortexCortex (1117377) | about 2 years ago | (#42102683)

It's good to see the focus of this release being an attempt to increase javascript speed by leaps and bounds. Modern webpages often use JS that goes way beyond anything people did 10 years ago (jQuery, for example), and the complexity of what people do with javascript noticeably slows down most webpages.

When I first learned to program in BASIC, I used to think that people should try speeding up C and Assembly language -- Make EVERYTHING run faster... Then I learned C and x86 Assembly and I realized, you can't speed up assembly language -- It's a perfectly optimized language, there's nothing under the hood to tweak. You might select a better algorithm, or make better use of registers, but that isn't changing ASM. C can't really be hugely optimized either; it's pretty close to the metal, but there are a few things one can do to increase performance in the space of its minimal abstractions -- fewer with a mature compiler on a mature hardware platform...

I always wondered what the deal was with JavaScript too, "Wow, it's getting faster, AGAIN?" Then I created my own languages and compilers and I learned: A sign of a horribly designed language is that the speed of its implementations can be repeatedly increased "by leaps and bounds"...

Re:JS Speed is the deciding factor in modern webpa (1)

dmomo (256005) | about 2 years ago | (#42103111)

I'm not so sure that the Javascript (well, ECMAScript) LANGUAGE is the problem. The challenges with respect to rendering speed have more to do with the DOM and the interaction with the browser itself. The DOM is a bulky beast. When javascript listeners are assigned to page elements, the code can in turn alter the DOM, creating or destroying elements, all of which can trigger javascript functions, any of which can create or destroy more DOM elements. It's a properly tangled mess. Memory management in this environment is no small task.
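
A tiny sketch of the tangle (element ids are invented; assumes a browser recent enough to have MutationObserver):

// A click listener mutates the DOM...
document.getElementById('list').addEventListener('click', function () {
  var item = document.createElement('li');
  item.textContent = 'added by listener';
  this.appendChild(item);
});

// ...and an observer reacts to that mutation by mutating elsewhere, so one
// event can fan out into a chain of DOM changes the engine has to track.
new MutationObserver(function () {
  document.getElementById('count').textContent =
      String(document.getElementById('list').children.length);
}).observe(document.getElementById('list'), { childList: true });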

Re:JS Speed is the deciding factor in modern webpa (4, Insightful)

madsdyd (228464) | about 2 years ago | (#42103225)

I don't understand why this comment got +5. It is pretty misguided.

The statement:

> I realized, you can't speed up assembly language -- It's a perfectly optimized language, there's nothing under the hood to tweak

makes some limited sense in some contexts (one could argue that the microcode supporting the assembler on the CPU is repeatedly optimized), but none in this one. The IonMonkey JIT does essentially optimize the assembler code[*], by rearranging it in various ways to make it faster. E.g. it takes stuff like this (written in javascript, as I have not written assembler in years):

for ( var i = 0; i != 10 ; ++ i ) {
    var foo = "bar";
}

and changes it to e.g. this:

for ( var i = 0; i != 10; ++i ) {
}
var foo = "bar";

possibly then this:

var foo = "bar";

This is an optimization, and it is performed at the assembler level (again: the above is not meant to be read as JavaScript, but as assembler).

The other statement that really sticks out is this:

> A sign of a horribly designed language is that the speed of its implementations can be repeatedly increased "by leaps and bounds"...

This simply highlights that the poster really does not understand the goals behind cross-platform languages, such as Java, Dalvik, JavaScript, Lisp, ML, Python, Perl, and so on, or the goals of weakly typed languages.

[*] It works on an abstract representation of the assembler code, but it might as well have been working directly on the assembler, were it not for the fact that this would require it to support too many assembler variants.

Re:JS Speed is the deciding factor in modern webpa (2)

chris.alex.thomas (1718644) | about 2 years ago | (#42103411)

How is it horribly misguided, when your one example proves his point?

You cannot optimise the ASM layer, because it's already directly on the CPU, but you can optimise the algorithm.

Your example did just that, and this optimisation is done FAR FAR above the assembly level.

Re:JS Speed is the deciding factor in modern webpa (1)

madsdyd (228464) | about 2 years ago | (#42103505)

I invite you to go read http://en.wikipedia.org/wiki/Program_optimization#.22Levels.22_of_optimization

And perhaps http://en.wikipedia.org/wiki/Tracing_just-in-time_compilation

As I said: The statement might make limited sense in some contexts, but not in this.

Re:JS Speed is the deciding factor in modern webpa (0)

Anonymous Coward | about 2 years ago | (#42103247)

Good luck programming web content with C and ASM...

Re:JS Speed is the deciding factor in modern webpa (0)

Anonymous Coward | about 2 years ago | (#42103535)

A sign of a horribly designed language is that the speed of its implementations can be repeatedly increased

The inability to optimize code is the sign of a horrible language; check out the large number of gcc optimization flags. Even then, C optimization hits some problems: pointer aliasing, and the one-step compilation necessary for whole-program optimization. Being close to the metal means more "do it this way" instead of "this is what I want", and every developer should know that "this is what I want" gives more freedom to write a good implementation. ASM is the extreme of "do it this way"; optimization is almost impossible. It has been a long time since humans were better at (large-scale) low-level optimization than compilers.

Re:JS Speed is the deciding factor in modern webpa (0)

Anonymous Coward | about 2 years ago | (#42103653)

Note that JavaScript is not a compiled but an interpreted/JIT-compiled language. That is, the execution time not only includes the execution speed of the generated code, as in the case of compiled languages, but also the time of compilation/interpretation. So, to have a meaningful comparison, you would have to compare against the time it takes to compile and run your C program.

Of course you could question the very concept of sending source code to the client, instead of sending something compiled to bytecode. But that's a completely different question; it has to do with how JavaScript is deployed, not how the language is defined. In principle it would be no problem to define bytecode for JavaScript, implement it in the browsers, compile the JavaScript to bytecode before putting it on the server, and send the bytecode to the browser. Of course if only one browser vendor did that, it would not be very useful. Even worse, if every browser vendor defined a different bytecode. So such a feature would only make sense if it were standardized.

Re:JS Speed is the deciding factor in modern webpa (0)

Anonymous Coward | about 2 years ago | (#42103711)

You have this mostly backwards: being able to increase the speed of implementations is actually a good thing. Consider:
* x86 assembly implementations *have* increased in speed by leaps and bounds, thanks to Intel and AMD. And that's a very good thing (which isn't only down to process technology).
* HTML video performance has increased by leaps and bounds (mainly by offloading to the GPU). Your hand-written x86-assembly video streaming code just won't perform on a mobile device. (I'm not sure your C program's layout code would always outperform webkit's general-purpose engine either, leaving aside the advantages of CSS/HTML over writing everything in C.)
* Your hand-written C parser probably parses slower than Perl does (you don't have the time to optimise your C code for that one-off sysadmin task).
* A SQL query stands a pretty good chance of performing better nowadays than your own storage implementation written in C.

Re:JS Speed is the deciding factor in modern webpa (2)

serviscope_minor (664417) | about 2 years ago | (#42103911)

Then I learned C and x86 Assembly and I realized, you can't speed up assembly language -- It's a perfectly optimized language, there's nothing under the hood to tweak.

Well, not really. You can't optimize perfectly optimized assembly any further, but that's just a tautology. You can optimize even quite well-written assembly further, especially on modern CPUs with their various selections of functional units, variable pipelines, multiple instruction dispatch per cycle, vectorising, etc.

In fact, the days have generally passed when I can write better ASM than what gcc can output from a C/C++/Fortran program, because it has much better knowledge of the CPU internals and is generally better at things like register allocation.

In terms of Fortran, C99 and GNU's extensions to C++, it can even be told when pointers don't alias, so one of the final benefits of assembly has vanished. With good SSE scheduling and support from within higher level languages, the other main benefit of assembler has vanished.

It's now at the point where C++ is generally faster than assembler for all but the best of assembler programs and some very specific cases, for instance where it helps to have direct access to something like the CPU flags.

One reason is that the optimizers have got really, really good. They can even figure out when you're stepping through a 2D array in the wrong order and reorder the loops. They are also excellent at removing redundancy, which means you can write simpler code (and therefore write more advanced algorithms much more easily) without worrying about redundancy killing performance.
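
The 2D-array case is easy to see even from JavaScript (a quick sketch; the array size is arbitrary):

var N = 1024;
var a = new Float64Array(N * N); // one flat buffer, indexed as a 2D grid

// Row-major: consecutive memory accesses, cache-friendly.
for (var y = 0; y < N; y++)
  for (var x = 0; x < N; x++)
    a[y * N + x] += 1;

// Column-major: stride-N accesses; an optimizer that proves the loops
// independent can interchange them to recover the fast order.
for (var x2 = 0; x2 < N; x2++)
  for (var y2 = 0; y2 < N; y2++)
    a[y2 * N + x2] += 1;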

The optimizers are amazing and do an awful lot. This is why optimized code is more or less impossible to debug. Once it does passes of inlining, partial redundancy elimination, loop unswitching, fusion, strip mining, dead code elimination, constant propagation, alias analysis, loop unrolling, modulo scheduling and register allocation, a given instruction probably doesn't correspond to any one line of code at all, so no stack trace is even possible.

And as for assembly, things aren't so simple there either. Look at the difference between the faster-clocking P4 and the much faster performing Ivy Bridge i7. The upshot is that the i7 essentially has an optimizer inside it which runs at 3.3 GHz and performs many of those steps in real time before actually farming out instructions for computation. IPC is all about optimizing assembler as much as possible.

Then I created my own languages and compilers and I learned: A sign of a horribly designed language is that the speed of its implementations can be repeatedly increased "by leaps and bounds"

It depends on what you mean by "horribly designed". Many languages are designed to be easy for some particular task rather than for maximal performance. It's hard to argue that Haskell is poorly designed, but optimizing it is a very tricky problem, and performance has certainly increased greatly as optimization techniques have become well understood.

Even FORTRAN, which was specifically designed to be optimizable even given the non-existent state of the art at the time (optimizing compilers were essentially invented for FORTRAN), has seen significant improvements in the performance of its compilers.

C and C++, which were designed to be close to the metal and so not to need much optimizing, have also got much better with improved optimizers.

It seems that the key to a high performing language is to allow the user to tell the computer as much as possible about the program's intent, so that things can be proven and optimization can be performed.

We can haz hardware acceleration? (0)

Anonymous Coward | about 2 years ago | (#42102619)

Can we get OpenGL-based hardware-accelerated rendering already?
Things are really, really slow, and every single other browser on my system outperforms firefox by a factor of 30, MINIMUM.
It's almost insufferable: scrolling is jerky, and interactive graphs, like those on github, update less than once a second (and completely max out a CPU core whilst doing so).

Re:We can haz hardware acceleration? (0)

Anonymous Coward | about 2 years ago | (#42102711)

Maybe you should get rid of your 10-year-old computer.

You already have it !? (1)

tuppe666 (904118) | about 2 years ago | (#42104041)

Can we get OpenGL based hardware accelerated rendering already?

You should already have it :)

Tools > Preferences > Advanced > General > Browsing: "Use hardware acceleration when available"

I can already notice a big improvement .. (1)

dextermanas (1534573) | about 2 years ago | (#42102699)

/. is now a little bit more bearable.

They mean its runs on some macs (1)

thogard (43403) | about 2 years ago | (#42102767)

It won't run on about 55% of the macs out there.

Re:They mean its runs on some macs (0)

Anonymous Coward | about 2 years ago | (#42103183)

why? You can't just post only that...

Re:They mean its runs on some macs (1)

wisty (1335733) | about 2 years ago | (#42103277)

I guess you are referring to the Mac OS X 10.5 requirement, or the need for x86?

Roughly 10% of Macs run Tiger, or a previous OS. Leopard has ~20%, SL 50%, and the Lions 20% (rough figures). Many of those Leopards are PPC. I'd guess at most 30% of Macs (which are actually in use - not old TAMs or Classics, if there's even a significant number of those) can't run Firefox.

Re:They mean its runs on some macs (1)

thogard (43403) | about 2 years ago | (#42103377)

Apple has reported that 55% of the people who can run Lion are not running Lion. For most users there is no reason to upgrade between major versions of OS X, and since the change between 10.5.8 and 10.6.3 is about like the change between the first 50 patches and Service Pack 2 for XP, I find it odd that a version isn't built, since all it should take is not using Xcode's default settings.

Re:They mean its runs on some macs (0)

Anonymous Coward | about 2 years ago | (#42103287)

It won't run on about 55% of the macs out there.

Go and use Safari and stop whining. You wanted the Apple Experience.

Firefox: software of choice for Casey Anthony (1)

asjk (569258) | about 2 years ago | (#42103233)

because sometimes law enforcement doesn't know there are other browsers.

When I just want the data..not the dressing (1)

fred911 (83970) | about 2 years ago | (#42103275)

lynx renders..

Breaks on github (1)

petsounds (593538) | about 2 years ago | (#42103621)

Actually, I shouldn't say that. Firefox started breaking on github around version 17.0. Many of the sub-project pages, e.g. the Issues page and the Markdown Raw viewer, redirect to a "Page did not load in a timely fashion" error page. This happens consistently on every github project. Unless the github team has done something weird on their end, this is another in the lengthy list of compatibility problems Firefox is beginning to have.

multiple threads? (1)

StripedCow (776465) | about 2 years ago | (#42103767)

What I liked about the previous mozilla javascript engines was that they supported multithreading. That made them suitable for web-server use, in contrast with, for example, Chrome's V8, which is not suitable for server use (unless you are prepared to spawn multiple processes instead of threads, which is of course very expensive performance-wise).

So I hope they support multithreading.
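
For reference, the process-spawning workaround looks roughly like this in Node (which embeds V8), using its stock cluster module; a minimal sketch, with the worker count and port picked arbitrarily:

var cluster = require('cluster');
var http = require('http');

if (cluster.isMaster) {
  // One V8 instance per process: far heavier than threads would be,
  // but currently the only way to use multiple cores with V8 on a server.
  for (var i = 0; i < 4; i++) cluster.fork();
} else {
  http.createServer(function (req, res) {
    res.writeHead(200);
    res.end('handled by worker ' + process.pid + '\n');
  }).listen(8000);
}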

oh, yet another javascript improvement (1)

allo (1728082) | about 2 years ago | (#42103901)

It's not like javascript is the problem with any current browser, but all of them work on improving their JS engines instead of taking a break from the n-th JS improvement and building a better browser UI. For example, mozilla should fix stuff like the menubar, favicon, statusbar... all the "we need to look like chrome" stuff. Hey, if I want something that looks like chrome, I use chrome. If it weren't for all the extensions fixing this crap, maybe I would have been a chrome user long ago, since firefox gave up its own identity and started to clone chrome.

Too little, too late (1)

Kergan (780543) | about 2 years ago | (#42103939)

People don't change browsers much, if those I know are any indicator. When they do, it typically is because one or more of three events occurred. The first is when they're actively shown an alternative by a preacher. The next is when they compare a site in different browsers and notice a material difference, e.g. when designing it. The last is when intense frustration leads them to actively seek alternatives.

In my (non-representative) sample, FF has been hemorrhaging irate users for several years now. And the list of things that irritated them seems unending. So this feels like too little, too late.

Re:Too little, too late (1)

davids-world.com (551216) | about 2 years ago | (#42103983)

I tried switching from Chrome (I was using the Canary build) because it kept crashing on me. I had to use Firefox nightly builds, because standard Firefox looks fuzzy on my Retina display MacBook Pro (high-res Retina displays have been out for almost 6 months, IIRC).

Unfortunately, Firefox turned out to have problems providing a cursor in the location bar after opening a new window (you had to set it by mouse!), and the privacy mode is broken: it removes all other windows and switches to anonymous browsing globally. Autofill did not work as well as I expected: user names often weren't filled in. Speed was not an issue.

Enough little issues for me to switch back to Chrome. It's sad that they haven't managed to make FF fully usable. Making anonymous browsing apply to a single window appears to require deeper architectural changes, according to the bug thread, which does not bode well for their overall design. Making it per-tab appears to be yet another story, which seems even stranger to me.

Holy Shit kill Java script and burn the blueprints (1)

gelfling (6534) | about 2 years ago | (#42104209)

For fuck's sake - 5 layers deep of scripts? Six? More? NoScript has become nearly useless when I have to turn on 5 or 6 LAYERS of scripts and 45 different scripts just to format a page. And on a good day they slow everything down to the level of running on a 486DX100 machine circa 1996.
