The job of testing and troubleshooting applications is tougher than it has
ever been. At least, that's what Ran Gishri, director of global marketing at
BMC Software, will tell you.
As the man in charge of BMC's AppSight product line -- recently renamed BMC
Application Problem Resolution System (APRS) -- Gishri often sees large development
shops struggle with increasingly complex and changeable business and technology environments.
"Many of the problems are just due to change. I get the sense from 60,000
feet that everything is changing," Gishri said. "More and more companies
are releasing more often. I think with all the agile methods and pressure to
build more and get it out sooner and make it high-quality...all that change
is really, really killing applications."
Gishri should know. BMC
Application Problem Resolution System 7.0 is designed to help dev teams
gather, assess and analyze information related to application troubleshooting.
By automating many of the labor-intensive activities around these tasks, APRS
7.0 aims to drive down resolution times and improve application quality.
The previous version of APRS, known as AppSight 6.0, was available in distinct
Windows and Java-based versions. The separate versions made it difficult to
sleuth issues that occurred with software processes crossing platform lines.
"Most large enterprises have mixed applications, a mix of Java and .NET,"
Gishri said. "If you are a tester, operating two different consoles, it's
very complex. It just didn't work. It didn't fly."
APRS 7.0 can automate problem detection and resolution across both Java 2 Enterprise
Edition (J2EE)- and .NET Framework-based infrastructures. The product also supports
C++ and Visual Basic development. Gishri said BMC re-architected APRS 7.0 around
a common middle tier -- derived from the Java version of AppSight 6.0 -- to
drive functionality from a single platform, rather than via multiple versions
of the tool.
"We know how to follow requests across platforms," Gishri said of
APRS 7.0. "You will be able to play back the recorded information and follow
the execution between the Windows client and Java-based server back and forth."
No surprise, Gishri has a bird's-eye view of the enterprise development market.
He said he's impressed with how far Microsoft's .NET Framework has come since
its initial launch, which he said was fraught with "glitches and problems."
Gishri said he has noticed a lot of big companies, which once focused tightly
on J2EE for enterprise deployments, shifting attention toward .NET.
"We're actually starting to see an increase in demand for .NET; we're
starting to see some decrease in demand for Java Enterprise Edition. This is
relatively new, only in the last 12 months," Gishri said. "Some of
it is definitely moving to .NET, and some of it is moving to lighter-weight
frameworks like Spring or other open source frameworks that are not that heavy
or that complicated to manage, or not that expensive."
Is your dev shop moving away from J2EE toward other frameworks? And if so,
why? E-mail me at firstname.lastname@example.org.
Posted by Michael Desmond on 04/29/2008 at 4:02 PM | 18 comments
In the world of theoretical physics, the Theory of Everything is a long-sought,
hypothetical model that would elegantly
explain and link all known physical phenomena, from the minute and unpredictable
world of quantum mechanics to the vast energies and scale that define the still-evolving
study of cosmology. It would finally bind gravity into the same system as the
strong nuclear, weak nuclear and electromagnetic forces. Our whites would be
whiter and our brights would be brighter.
But like so many things in life that seem almost too good to be true, the Theory
of Everything has proven hard to achieve. The area of research has even had
its own Cold Fusion moment when a paper written by erstwhile academic physicist
and now semi-employed surfer dude A. Garrett Lisi drew attention for its exceedingly
simple effort to solve the Theory of Everything.
As it happens, Microsoft is chasing its own Theory of Everything in the form
of Live Mesh. The effort
could help Microsoft break through long-standing boundaries that have prevented
users from freely tapping their data and applications on PCs, appliances and
devices of every stripe.
As RDN contributing editor John Waters reports
from the Live Mesh announcement and demo at the Web
2.0 Expo this week in San Francisco, Live Mesh is a cloud-centric data synchronization
and collaboration services effort that offers a consumer play on Microsoft's
accelerating Software + Services (S+S) strategy.
"In a nutshell, Live Mesh allows individuals, their devices and their
data to become aware of one other, and establish networks to permit file synchronization
across all of it," industry analyst Neil Macehiter told Waters after the announcement.
Live Mesh has been two years in the making and is widely
credited to the hide-and-seek visionary genius of Microsoft Chief Software
Architect Ray Ozzie. In fact, this launch may end up being remembered as the
true beginning of the Ozzie era at Microsoft, when the company stopped talking
about open systems and interoperability and really did something strategic about it.
Notably, Live Mesh promises a cross-platform development environment. Developers
can craft Live Mesh-aware services and applications in Java, Flash, Ruby, Python
and numerous other non-.NET languages. The Mesh Operating Environment that undergirds
Live Mesh services hews to standard fare like the Atom Publishing Protocol,
JSON and RSS. The universe of supported devices and hardware is expected to
be diverse, as well -- though today, support is limited to just Windows XP and Vista.
But let's not get carried away here. Developers can expect Visual Studio, .NET
languages like C# and VB.NET, and rich Internet application (RIA) platforms
like Silverlight to emerge as first-class citizens in the Live Mesh universe.
What's more, as RDN
columnist Greg DeMichillie writes in his latest column, which will appear
in a future issue of RDN: "Once a developer builds an application
on top of Live Mesh, they are beholden to Microsoft in perpetuity."
Obviously, it's very early in the Live Mesh cycle yet, and in the months to
come we'll see expanding platform support. But I'll be looking forward to hearing
a lot more about the direction Microsoft intends to take with what amounts to
Microsoft's Theory of Everything.
What are your impressions of Live Mesh and what are some of your biggest concerns?
E-mail me at email@example.com.
Posted by Michael Desmond on 04/24/2008 at 4:02 PM | 1 comment
Howard A. Schmidt has forgotten more about network and systems security than
I will probably ever know. A pioneer in the area of computer forensics, he served
for more than 30 years as an information security advisor to the FBI, the U.S.
Air Force and the Bush administration after Sept. 11, 2001.
Recruited by Microsoft in the mid-'90s, Schmidt served as the company's
first chief security officer and, in April 2001, helped launch the company's
Trustworthy Computing initiative before leaving to become CSO of eBay in 2003.
Today, Schmidt is the president and CEO of R&H Security Consulting LLC.
RDN Senior Editor Kathleen Richards caught up with Schmidt the week after the
RSA Conference to find out where security in a Web 2.0 world is headed.
Here are a few excerpts from the conversation. You can read the entire account here.
RDN: What kind of tools should developers be using?
We have to look across the entire spectrum. We should not be asking our developers
to develop software and then throw it over the fence and say, OK, Quality Assurance
will find the problems with it. We should be giving the developers the tools
right from the very outset to do the software scanning and the source code analysis.
And that does two things. One, it helps them develop better code as they discover
things through the automated scanning process on the base code itself. But it
also, once it gets to Quality Assurance, gives them the ability to focus more
on quality issues rather than on security issues, which you can eliminate in
the first round.
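Schmidt's shift-left argument -- give developers scanning tools at the outset rather than relying on QA to catch flaws -- can be illustrated with a toy source-code scan. The sketch below is purely illustrative (it stands in for the commercial scanners he alludes to, and the `RISKY_CALLS` list is an invented example), using only Python's standard `ast` module:

```python
import ast

# Illustrative only: a toy source-code scan in the spirit Schmidt describes,
# not any particular commercial tool. It flags calls to eval/exec -- a classic
# risky construct -- before the code ever reaches QA.
RISKY_CALLS = {"eval", "exec"}

def scan_source(source: str):
    """Return (line number, call name) for each risky call found."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in RISKY_CALLS:
                findings.append((node.lineno, node.func.id))
    return findings

sample = "x = eval(user_input)\nprint(x)\n"
print(scan_source(sample))  # → [(1, 'eval')]
```

Wired into a pre-commit hook or build step, even a check this simple demonstrates the payoff Schmidt describes: the developer sees the finding immediately, and QA never has to.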
The second thing: when you look at the compiled binaries and the way those
things work, generally we look at the pen test side of things. We can't ignore
that because, when you put it in the production environment, there may be other
linkages somewhere that create a security flaw in the business process while
the code itself is fine.
Then clearly the third level of that is in a Web application -- Web 2.0 environments,
for example. Now you have the ability not just to pull information down but
to interact directly. This creates a really, really dynamic environment, and
even simple things like cross-site scripting and SQL injection have to be tested
for once things are out in the wild.
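The SQL injection class Schmidt mentions is easy to demonstrate. A minimal sketch using Python's built-in sqlite3 (the table and attacker input are invented for illustration) shows both the flaw and the parameterized-query fix:

```python
import sqlite3

# Hypothetical table and input, illustrating the SQL injection flaw
# Schmidt mentions and the parameterized query that closes it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

attacker_input = "' OR '1'='1"

# Vulnerable: string concatenation lets the input rewrite the query.
unsafe_rows = conn.execute(
    "SELECT role FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(unsafe_rows)  # → [('admin',)] -- the WHERE clause was bypassed

# Safe: the driver binds the value, so quote characters stay data.
safe_rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe_rows)  # → [] -- no user is literally named "' OR '1'='1"
```

This is exactly the kind of flaw that automated testing "out in the wild" catches: the code compiles and works for honest input, and only misbehaves when an attacker supplies the query fragment.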
You worked at Microsoft for five years and were one of the founders of
its Trustworthy Computing Strategies Group. Craig Mundie outlined
an "End to End Trust" model at the recent RSA conference. What's your take
-- is there something new there?
I don't know that there is something new. I think it is just a continuation
of the fact that there is no single point solution in any of these things in
any environment. It is not a hardware solution. It is not a software solution.
It is not a business process solution. It is not an identity management solution.
Does Microsoft's recent interoperability
pledge change the security equation?
It does. One of the complaints people had over the years was the inability to
write security-related APIs, because they didn't know what they were going to
do with the other ones. So having access to the APIs, knowing what function
calls are out there, and knowing how the security that you implement is going
to impact that will once again take us a step further.
What did you find noteworthy at the recent RSA Security Conference?
As we develop greater dependency on mobile devices, the bad guys will start
using unsigned applications on the mobile device to commit the next-gen of cyber
crimes and we need to look at it now and build that into the phones that we
will start using in the near future.
You can read the rest of this Q&A here.
What were your impressions from the RSA Security conference? And is your organization
making any changes to help counter emerging threats? Email me at firstname.lastname@example.org.
Posted by Michael Desmond on 04/22/2008 at 4:02 PM | 0 comments
Microsoft has a long and storied history of creating mind-altering corporate
videos. It runs the gamut, from the slick, minimalist tribute that is the Volkswagen-inspired
"Da Da Da" spot to the hopped-on-meth circus show that is Steve Ballmer's
infamous stage antics.
So you can imagine my joy when I was at our RedmondReport.com
news site, which pulls together Microsoft-related news from across the Web,
and found this.
This leaked Windows Vista SP1 sales video could be the best thing I've seen
since the Hillary4U&Me
"fan-paign" video that became a YouTube hit this winter, or the
"Hot! Hot! Hot!"
promotional spot for Appalachian State University, which hit the tubes last
fall. Scratch that -- this internal Vista sales video is even better.
I know a lot of folks will deride Microsoft for making such an ill-considered
bit of derivative promotional media. Others may spend the rest of the afternoon
washing their eyes out with bleach, hoping to wipe the memory of this thing
from their brains (good luck with that).
But not me. Because I know that this is the stuff of Internet greatness. Twenty
years ago, you had no shot at getting a terrifying glimpse inside Microsoft's
promotional machinery. Today, you wish you could avert your eyes.
One thing I do know: I'm going to meet that BitLocker guy in my nightmares.
Is there a corporate YouTube classic that ranks as your worst ever? And by
worst ever, I mean best ever. E-mail me at email@example.com
and share the pain.
Posted by Michael Desmond on 04/17/2008 at 4:02 PM | 4 comments
Virtualization has gained a lot of traction in the developer community, particularly
in the areas of QA and test. And it's become so ubiquitous in the general IT
space that our parent company, Redmond Media Group, recently launched a new
publication called Virtualization Review
to provide dedicated coverage
of virtualization issues. You can find the Web site here.
My question is: Are you, as developers, actively moving to virtualized environments
and solutions as a way to improve productivity, broaden the scope of your work
and achieve higher efficiencies? If so, we want to hear from you.
Tell us how you're making use of virtualization in your development operations
and what you want to see tool vendors provide. This is your chance to tell the
industry what you need to get work done. E-mail me at firstname.lastname@example.org.
Posted by Michael Desmond on 04/17/2008 at 4:02 PM | 0 comments
Back in June of last year, we featured Jeff Atwood in the pages of Redmond
Developer News, profiling him and his popular Coding Horror developer blog
in the Cool Developer Tricks section of RDN.
At the time, Jeff had expressed a concern that some of the big-name dev bloggers
he looked up to were busy running their own software companies and had a lot
more going on than he did. Now it seems that Jeff is onto a few small things
of his own. Since our interview, Jeff has moved on from his position as a senior
technical evangelist at Vertigo Software to devote more of his time to blogging
and pursuing an open source project of his own, called Stackoverflow.com.
And, as he recounts in an April
10 Coding Horror blog entry, Jeff has recently awarded a $5,000 grant to
Dario Solera of the ScrewTurn Wiki project, which is developing an ASP.NET-based
Wiki engine. The award is part of an announced program to recognize outstanding
and important efforts in the field of .NET open source development.
Here's where it gets interesting. Jeff contends that "open source projects
are treated as second-class citizens in the Microsoft ecosystem." He says
Microsoft is not only wrong to withhold support from open source projects that
contribute to the .NET universe; he believes Microsoft's fate as a dev tools
provider hinges on the company changing its approach.
It's a point worth discussion. Dev shops worldwide rely on diverse open source
tools like DotNetNuke, MbUnit, NAnt, NHibernate and ZedGraph, just to name a
few. And yet, for all of Microsoft's efforts to embrace, welcome and work with
the open source community (CodePlex, the IronPython and IronRuby projects, Mono
development, etc.), it's clear that the .NET-aligned, open source developer
community isn't feeling the love from Redmond.
Are you using or considering the use of open source tooling in your development
projects? Tell us how you are using these .NET-savvy tools either alongside
or in place of Microsoft's own products. Let me know at email@example.com.
Posted by Michael Desmond on 04/15/2008 at 4:02 PM | 10 comments
It's the kind of story that should rightly give anyone the chills. Yesterday
at the RSA Conference
in San Francisco,
penetration testing expert Ira Winkler told the audience that the networks of
power companies are vulnerable to attack.
He should know. Winkler, you see, was able to hack into one such network in
less than a day.
Winkler and his team, working at the company's behest, were quickly able to
gain access to several employees' systems -- by way of a simple phishing attack.
From there, they could access the network controlling the power station's monitoring
and distribution operations. And from there, a lot of things -- mostly bad --
can happen. You can read a Network World article about Winkler's presentation here.
The problem, Winkler contends, isn't so much with gullible employees who should
know better than to click a link on a faux e-mail message. It's with the slap-dash
evolution of systems and networks at the power companies. As Winkler explains
in a 2007
blog post, the Supervisory Control and Data Acquisition (SCADA) systems
employed inside power companies are no longer isolated from external threats.
The air gap that once protected these systems has been bridged by what Winkler
calls the "lazy and cheap" behavior of people at these companies.
The worst thing? Winkler says power companies' fear of service interruptions
makes them reluctant "to update SCADA systems, and the systems and networks
that support them." It's a recipe for disaster that Winkler has urged power
companies to uncook. He calls for SCADA systems to be unlinked from the public
network and for power companies to deploy software and systems that enable reliable
and rapid patching.
What do you think of Winkler's warning to the power industry? And what can
development managers do to ensure that critical systems like these prove less
susceptible to attack? E-mail me at firstname.lastname@example.org.
Posted by Michael Desmond on 04/10/2008 at 4:02 PM | 0 comments
Microsoft is, of course, a leader in the arena of IT and software development.
And yet, I often feel that Microsoft runs the business the way I drive in rush-hour
traffic -- an abrupt, panic-filled drama punctuated by angry shouting and the
occasional triple-lane change. Still, the company almost always seems to get
where it's going.
In the mid-'90s, Microsoft reworked its entire MSN strategy and launched Internet
Explorer in response to the rise of the World Wide Web and Netscape. Just six
weeks ago, Redmond suddenly announced a strategic
interoperability pledge in response to competitive and regulatory pressure.
And now today, at the RSA
Conference keynote in San Francisco, Microsoft Chief Research and Strategy
Officer Craig Mundie talked about what the company calls its "End to End Trust" vision.
We've heard this word "trust" before. It emerged in January 2002
in the now-famous Bill
Gates e-mail that, for the first time, placed security ahead of functionality
in the Redmond product development stack. As Gates wrote at the time:
"Trustworthy Computing is the highest priority for all the work we
are doing. We must lead the industry to a whole new level of Trustworthiness in computing."
That e-mail helped launch the Security Development Lifecycle (SDL) at Microsoft,
and was critical in cleaning up the mess in such strategic products as Microsoft
Office, SQL Server and IIS.
Now, Microsoft is broadening the challenge, seeking to drive a strategic discussion
of privacy and security to the larger Internet. In the months ahead, you'll
hear a lot of talk about the "trusted stack" and what it will take
to achieve what Microsoft calls a "more secure and trustworthy Internet."
This is a big topic Microsoft is taking on, and one that we should all make
a point to pay very close attention to. The idea of a trusted stack certainly
brings with it a host of integration, leverage and lock-in opportunities for
Microsoft. But it also invites a true "lift all boats" scenario, where
Microsoft may stand to profit most by working closely with, rather than competing
against, its fiercest rivals.
If there's one truism in all of these developments, it's this: The work is
never done. On the same day Microsoft announced its End to End Trust vision,
the company sent out a Patch Tuesday security bulletin detailing five
critical security vulnerabilities.
What do you think of Microsoft's End to End Trust vision and its push for a
trusted stack? Speak up at email@example.com.
Posted by Michael Desmond on 04/08/2008 at 4:02 PM | 3 comments
Maybe it's because I'm a middle child in an angry, Irish family, but I've always
played the role of diplomat. Whether it's soothing tempers around the dinner
table or hoping to find common ground in a heated political discussion, I'm
not one to admire intransigence.
So imagine my dilemma covering the ongoing push to make Office Open XML (OOXML)
an ISO standard. After talking to some of the brightest minds in the industry,
I've come to an unsatisfying conclusion: Smart people can, and often must, disagree.
And sometimes, they must disagree violently.
Which helps explain the invective coming out of the open source and OpenDocument
Format (ODF) community this week, in the wake of the April 1 announcement by
the ISO that OOXML had, indeed, won
approval as a standard.
Sun's Tim Bray, who represented Canada in the ISO Ballot Resolution Meeting,
wrote in his blog back on Feb. 29 at the conclusion of the BRM session:
"The process was complete, utter, unadulterated bullsh*t. I'm not
an ISO expert, but whatever their 'Fast Track' process was designed for, it
sure wasn't this. You just can't revise six thousand pages of deeply complex
specification-ware in the time that was provided for the process."
Another essay offers a more balanced perspective, though its criticism of the
technical process is no less pointed.
Meanwhile, guys like Mono Project founder Miguel de Icaza praise
the technical worth of OOXML and earn a firestorm of scathing critique from
open source advocates. Andrew Brust, an RDN contributor, Microsoft regional
director and chief of new technology at consultancy twentysix New York, said
of the process that Microsoft was forced to counter targeted opposition from
competitors and open source advocates.
"I think the worst you can say about that effort was that it was necessary
to make the vote fair, and it was unfortunate that the OOXML standard could
not be judged exclusively on its technical merits," Brust said. "Were
it judged that way, without the politics, I think it would have won approval
[in the first round of voting], and done so with much less rancor."
OOXML has passed muster in the ISO, and Microsoft is, predictably, calling
for people to set aside their differences in the ratification process and move
forward. But with the European Commission looking
for signs that Microsoft abused its monopoly position in the ISO process,
and the real possibility of an appeal being filed, it's clear that the healing
process may take longer to start than even a diplomat like myself might hope.
E-mail me at firstname.lastname@example.org.
Posted by Michael Desmond on 04/03/2008 at 4:02 PM | 4 comments
Yesterday, we were all expecting to hear the final results of the vote to make
Microsoft Office Open XML (OOXML) an industry standard under the International
Organization for Standardization (ISO). That announcement has been put off until
tomorrow, but rumors suggest that Microsoft may have narrowly won the support
it needed to gain ratification.
We'll be covering the ISO process and its implications in our next issue of
Redmond Developer News magazine, but in the meantime, I'm left to wonder:
What do developers think of the noisome ballot process that has brought OOXML
to the doorstep of ISO ratification? What could have been done better? And what,
perhaps, shouldn't have been done at all? E-mail me at email@example.com.
Posted by Michael Desmond on 04/01/2008 at 4:02 PM | 0 comments
I've spent the last couple of days in San Francisco at VSLive!, which offers
Visual Studio developers a chance to glimpse Microsoft dev tool
roadmaps, hone technical skills, and explore important new tooling like Language
Integrated Query (LINQ) and the latest version of Visual Studio Team System
(VSTS). Along the way, attendees also get a chance to voice their opinions about
the tools they use every day.
All of that was in evidence at the session Stephanie
Saad gave on Monday afternoon. The VSTS group manager is heading up work
on the next release of VSTS (codenamed "Rosario"), and she was actively
working the audience to get a sense of what they wanted, and wanted changed,
in the upcoming toolset.
While her early demos of VSTS-Microsoft Project integration fell a bit flat,
Saad pleased the crowd when she showed off the promised reporting tools within
the next Team System version. And no wonder: When Saad asked developers if they
struggled to author reports, one attendee replied flatly, "We gave up."
Saad then demoed the slick integration of Excel to display flexible and compelling
report charts from simple queries, drawing applause from developers.
There's more, of course, including SharePoint integration for project dashboarding,
promised client-side code search and enhanced test functionality with a focus
on manual testing. Ultimately, Saad noted, Microsoft's goal isn't to produce
best-of-breed tooling across VSTS, but rather to deliver the most well-integrated toolset.
In short: Development managers will face some tough choices in the years ahead
as they weigh the benefits of a focused testing suite like BMC's AppSight,
for example, against the across-the-board plug-and-play value offered by Rosario.
What do you think of Microsoft's efforts with VSTS? Is the company focusing
on the right things or are there specific areas it really needs to address?
E-mail me at firstname.lastname@example.org.
Posted by Michael Desmond on 04/01/2008 at 4:02 PM | 0 comments
If Google has taught the world one thing, it's this: Search is good.
So I stood up and took note last month when Krugle
introduced the second version of its Krugle Enterprise Appliance, an enterprise
search network device that lets developers and managers track down specific
code assets across repositories and over the Internet. The appliance can search
enterprise code indexed behind the firewall, and also provides access to Krugle's
public index of more than 2.6 billion lines of open source code.
According to Matthew Graney, Krugle's senior director of product management,
Krugle Enterprise Appliance 2.0 extends native support for source code management
systems, adding ClearCase and Microsoft Team Foundation Server to support for Subversion,
Perforce and CVS. Krugle search supports more than 40 languages. It also features
improved management and configuration features.
Graney said the product meets a long-standing need among development teams.
"Whether you are doing open source or other kinds of development, the biggest
challenge facing developers is often finding what is out there," Graney
said. "That is really the primary benefit."
Honestly, I'm surprised we don't hear more about code search solutions, especially
given the increasingly distributed and componentized nature of development.
Tim Murphy, a senior consultant for Daugherty Business Solutions, agreed. He
said effective code search is a great way for developers to get past bottlenecks
and learn from each other.
"The most productive developers I have met are where they are because
of the resources that they utilize," Murphy wrote in an e-mail interview.
"Reading other people's code is a great way to find solutions but it is
also a way to get new ideas for approaches to development."
More than that, a device like Krugle's, which plugs into the datacenter and
crawls code repositories and other sources, can help large organizations do
a better job of reusing code. Michael Cote, analyst for research firm RedMonk,
sees developers searching across business units and projects to find blocks
of code that address a problem at hand.
Of course, Web search helped turn keyword- and key phrase-writing into a profit-making
art, and launched a cottage industry of search engine optimization consultancies.
I imagine code search will put a real emphasis on well-commented and well-organized code.
Do your shops or projects make use of code search solutions? What would you
like to see to help you do your jobs better? E-mail me at email@example.com.
Posted by Michael Desmond on 03/27/2008 at 4:02 PM | 0 comments
Unless you've been hiding under a rock, you've almost certainly noticed Microsoft's
shift toward a more open, interoperable and standards-savvy approach to development technologies.
It's a trend that started with the release
of the .NET source code and promotion of XML-based
open file formats for Office, and recently culminated in the Feb. 21 "Interoperability
Principles" announcement.
What's remarkable isn't that Microsoft is playing nicely with others; the company
has enjoyed a long history of fruitful partnerships, often with unexpected partners
like Apple, Red Hat and Sun Microsystems. Rather, it's the way that Microsoft
seems to be turning its vast and remarkable ecosystem toward the emerging threat
and opportunity posed by open source development and standards-based Web services.
You see, in the past, Microsoft would identify a key market opportunity or
threat and target it with a compelling offering. So Microsoft Excel was set
against Lotus 1-2-3, MSN was set against AOL and Internet Explorer competed
with Netscape Navigator.
But how does Microsoft compete against a movement? The rapidly growing web
of standards and open source solutions serving the enterprise poses a real danger.
If You Can't Beat 'Em, Join 'Em. Then Beat 'Em.
Unable to simply set a product against this broad phenomenon, Microsoft instead
mirrors the actions of Daniel Plainview in the acclaimed movie There Will
Be Blood, and proposes to drain the momentum and revenue right out from
under the competition.
The Office Open XML (OOXML) file format is a case in point. Faced with an established
open source XML standard (OpenDocument Format) and the real possibility that
government and regulatory bodies might mandate a move off the proprietary binary
Office formats, Microsoft launched OOXML as an industry standard. More than
that, it's moved aggressively to deploy an ecosystem of partners, on display
at the recent Interoperability Lab in Cambridge, Mass., to surround ODF's position.
You can almost picture Daniel Day-Lewis describing his long straw as it reaches
across the room and saying, "I drink your milkshake. I drink it up!"
Just as Plainview surrounded a sought-for oil field and drained its wealth
right out from under the owner's deed, Microsoft is shifting the competition.
Instead of simply going head-to-head against ODF or Web services or competing
browser platforms, Redmond is going over, under, around and into the competition.
It's co-opting standards-making processes and adopting mature industry standards
even as it opens access to its once-hidden IP.
The good news for developers is that we can expect Microsoft to become increasingly
engaged with both open source- and industry standards-based projects and efforts.
We can expect to enjoy broader and deeper access to Microsoft IP than ever before.
And we can expect more partnerships like the PHP project with Zend Technologies
and the Mono and Moonlight projects with Novell. This is all welcome news.
But don't think for a moment that Microsoft isn't anxious to monetize or leverage
every precious ounce of IP it can muster.
In less than a week, we should learn the results of the long-running OOXML
standardization battle in the ISO. Win or lose, this is just the beginning.
Even without ISO imprimatur, Microsoft will make its file format spec a major
contender. It's breaking down the competitive value of ODF by opening its own
standard, by enabling an ecosystem of rich solutions around OOXML and by making
available the vast leverage of the Office and Windows platforms.
And Microsoft is working to apply this same model across the entire enterprise
Web space, from Web services to rich Internet application runtimes. To turn
an old phrase: If you can't beat 'em, join 'em. Then beat 'em.
What do you think of Microsoft's open and interoperable strategy? E-mail me.
Posted by Michael Desmond on 03/25/2008 at 4:02 PM | 2 comments
Even as Mozilla fixes and tweaks
its own code
, the open source purveyor is motivating developers to do the
same with their add-ons and extensions. Mozilla just announced the Extend Firefox
3 contest, which rewards developers for creating outstanding add-ons
for the forthcoming release of Firefox.
The contest will run through July 4 and emphasizes outstanding UI, innovative
approaches and the use of open standards. A new category will recognize the
best updated versions of existing add-ons.
You can find out more details at extendfirefox.com.
Posted by Michael Desmond on 03/20/2008 at 4:02 PM | 0 comments
When the Mozilla development team started work on Firefox 3 back in 2005, one
of the key issues facing the group was the issue of memory management. Shifting
usage patterns and increasingly demanding Web environments had exposed issues
like insidious memory leaks and application processes that failed to let go
of allocated memory. The result: Degrading performance and concerns about stability.
About a month ago, Mozilla developer Stuart Parmenter wrote an informative
account of how his team went about attacking the memory allocation and handling
problem. You can read it on his blog.
I spoke with Stuart and with Mozilla Vice President Mike Schroepfer yesterday
about the Firefox 3 development process. To improve memory performance, the
team had to go beyond tweaking their code and build custom tooling to help them
measure and address the challenge. It wasn't a trivial task, Parmenter said.
"I would say it was pretty difficult. We had to take a look at a pretty
low level at what was going on. So we actually had to end up building a lot
of custom tools to monitor when our memory was being allocated, how it was
being allocated, how it was being freed and what the result of that was,"
Parmenter explained. "So a lot of really deep technical issues there. We
built a lot of tools to measure that."
The results are apparent to many users working with the current beta 4 of
Firefox 3, but Parmenter's blog post offers pretty compelling quantitative results,
as well. His team cooked up an aggressive series of tests that reflect the evolving
nature of Web browsing, and measure how well the code behaves when juggling
multiple sites in multiple tabs over a long period of time. According to their
tests, Firefox 3 does a much better job of releasing memory, with a terminal
state that's 60 percent below that of Firefox 2.
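Mozilla's instrumentation was custom, C++-level tooling, but the workflow Parmenter describes -- snapshot allocations, compare, then verify that memory is actually released -- can be sketched at small scale with Python's stdlib `tracemalloc`. The "leaky cache" below is invented purely for illustration:

```python
import tracemalloc

# A toy version of the allocation tracking Parmenter describes. Mozilla's
# tools were custom and C++-level; stdlib tracemalloc just illustrates the
# snapshot-and-compare workflow: measure, find the growth, release.
tracemalloc.start()
baseline = tracemalloc.take_snapshot()

cache = [bytearray(1024) for _ in range(1000)]  # simulate a leaky cache
grown = tracemalloc.take_snapshot()

# compare_to returns statistics sorted with the biggest growth first,
# attributed to the source line that did the allocating.
stats = grown.compare_to(baseline, "lineno")
worst = stats[0]
print(f"largest growth: {worst.size_diff} bytes")

cache.clear()  # release the memory, as Firefox 3 learned to do
```

Run repeatedly across long sessions -- the multi-tab, multi-site soak tests Parmenter's team used -- this kind of before-and-after comparison is what turns "the browser feels slow" into a specific line of code to fix.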
What might it all mean for corporate developers and managers? Parmenter and
Schroepfer said that all development efforts can benefit from the kind of continuous
build process that's been in place at Mozilla throughout the Firefox 3 effort.
The team produces nightly builds of Firefox, Schroepfer said, and has kicked
out eight alphas and four public betas in the time that Microsoft has produced
one public milestone of Internet Explorer 8.
What are your impressions of the latest Firefox beta? Is the Mozilla team onto
something, and can we all learn something from them? E-mail me at firstname.lastname@example.org
and let me know why or why not.
Posted by Michael Desmond on 03/20/2008 at 4:02 PM | 0 comments