Database Jobs Provide Job Security, Says Survey

A recent salary survey indicates that database-related jobs provide good job security, and don't rank too badly on the salary side of things, either.

Visual Studio Magazine's 2012 .NET Developer Salary Survey noted that, "In terms of top job functions for security and retention, database administrator/developer ranked highest (46.5 percent), followed by senior engineer/senior software developer (43.5 percent) and software architect (43 percent)."

As for technologies perceived to provide job security/retention, SQL Server ranked No. 2.

Salary-wise, the average base for database administrator/developer types was $91,276, pretty much aligned with the median base salary of all respondents, $92,000.

That compares to a $95,212 average base salary reported by database developers in Redmondmag.com's 2011 Windows IT Salary Survey last August. Interestingly, in that survey, the data devs' salary had fallen from No. 1 the previous year to No. 4.

Some more tidbits for you data types in the new .NET developer survey:

"Only 4.2 percent of survey respondents categorized their role as database administrator/developer. However, 67.5 percent of 1,104 respondents reported a background -- they had worked on a project for at least six months -- in database development: 45.3 percent in database administration and 24.2 percent in data warehousing."

It seems to me in this still-shaky economic climate that high job security is comparatively better than a high salary. Remember, if you're a working database developer, you're lucky to have a job, and probably thousands of equally qualified unemployed workers would gladly trade places with you at just about any salary.

Or, as one respondent put it, "There is a salary freeze and I do not anticipate any changes (which is fine with me ... I'm employed)."

What is it about the database field that provides (relative) job security? Comment here or drop me a line.

Posted by David Ramel on 01/12/2012 at 12:52 PM


Oracle Developers Playing in the Microsoft Sandbox? Indeed.

As Microsoft continues to make news about opening up its developer technologies (the latest being opening its Windows Azure cloud platform to Linux servers), it's easy to forget how the process works both ways. Witness last week's under-the-radar release by Oracle of the production data provider "for Entity Framework and LINQ developers." This lets Oracle developers do all their work in Visual Studio for certain projects while taking advantage of almost all the latest Microsoft database APIs.

My, how open source has changed things. Remember the old days, when proprietary software vendors fought tooth and nail to lock users into their technologies? For you database developers, it used to be Microsoft (SQL Server) vs. Oracle vs. Borland vs. Sybase, and, on a broader scale, it evolved into .NET vs. Java. Developers were firmly entrenched in one camp or the other and felt free to viciously (and usually anonymously) flame the non-believers in forums, comments and blog posts.

Now, it seems, every software development tool will soon just work with every other software development tool. We're heading for one big, happy family of developers.

Anyway, back to the news of special importance to you data developers. I guess Oracle decided to bury the announcement of "ODAC 11.2 Release 4 and Oracle Developer Tools for Visual Studio (11.2.0.3.0)" because the beta has been out for quite some time. The 11.2 Release 3 download was posted exactly a year earlier.

Release 4 "introduces tools and data provider support for ADO.NET Entity Framework, Language Integrated Query (LINQ), and WCF Data Services," according to an Oracle data sheet (PDF here).

The release's database client works with Oracle Database 9.2 and above. On the Microsoft side, it supports Visual Studio 2010 and the .NET Framework 4, with support for Entity Framework 4.1 and 4.2. It also supports OData, LINQ to Entities and "implicit REF CURSOR parameter binding." However, it doesn't support some of the newer Entity Framework features, such as Code First and (apparently) DbContext. (Non-support of the latter isn't mentioned explicitly in the latest announcement, but it wasn't included in earlier versions.)

To show developers how to use the Entity Framework with the data provider, Oracle has posted this article and an "Entity Framework, LINQ and Model-First for the Oracle Database" tutorial. Much more related information can be found in the Oracle Data Provider for .NET Developer's Guide.
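To give a flavor of what that enables, here's a minimal sketch of a LINQ to Entities query against an Oracle-backed model. Since Code First isn't supported, it assumes a database-first or model-first context generated with the new design-time tools; the HREntities context and its Employees set are hypothetical names of my own, not anything from Oracle's documentation:

```csharp
using System;
using System.Linq;

public class SalaryReport
{
    public static void Run()
    {
        // HREntities is a hypothetical context generated from an Oracle
        // schema with the ODAC 11.2 Release 4 design-time tools.
        using (var context = new HREntities())
        {
            // The data provider translates this LINQ expression into Oracle SQL.
            var topEarners = from emp in context.Employees
                             where emp.Salary > 10000
                             orderby emp.Salary descending
                             select new { emp.FirstName, emp.LastName, emp.Salary };

            foreach (var e in topEarners)
                Console.WriteLine("{0} {1}: {2:C}", e.FirstName, e.LastName, e.Salary);
        }
    }
}
```

The point is that nothing here looks Oracle-specific; the provider does the translation underneath.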

The new production release comes in 32-bit and 64-bit downloads, with different installer/deployment options, including Xcopy.

The Oracle data provider is just one of about a dozen third-party ADO.NET providers, including one for MySQL.

What do you think of Oracle's support for Entity Framework and move toward more interoperable technologies in general? Comment here or drop me a line.

Posted by David Ramel on 01/05/2012 at 12:52 PM


Microsoft's Windows Azure Leads the Data Revolution

It was about two years ago when I first wrote about the exciting development possibilities of "Mining the Cloud," with new data markets such as the "Dallas" project on Windows Azure.

Well, Dallas has matured into the Windows Azure Marketplace, and at least one forward-looking research organization is predicting that the effort will blossom into something really big. One of O'Reilly Radar's "Five big data predictions for 2012," published last week, is the "Rise of data marketplaces." It reads:

"Your own data can become that much more potent when mixed with other datasets. For instance, add in weather conditions to your customer data, and discover if there are weather related patterns to your customers' purchasing patterns. Acquiring these datasets can be a pain, especially if you want to do it outside of the IT department, and with some exactness. The value of data marketplaces is in providing a directory to this data, as well as streamlined, standardized methods of delivering it. Microsoft's direction of integrating its Azure marketplace right into analytical tools foreshadows the coming convenience of access to data."

Indeed, the "dozens of feeds" I discovered in my initial exploration of Dallas have grown: Windows Azure Marketplace now boasts "thousands of subscriptions and trillions of data points," with more coming online regularly, such as historical weather data and a "Stock Sonar Sentiment Service" added last month.

Two years ago I demonstrated how easy it was to subscribe to a data feed and incorporate it into custom reports and visualizations. Imagine what developers can do now.
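If you're wondering how little code that subscription scenario takes today, here's a hedged C# sketch of pulling a Marketplace feed. The dataset URL is made up, and I'm assuming the Basic-auth convention (your account key as the password) that the Marketplace used at the time of writing:

```csharp
using System;
using System.Net;

public class MarketplaceSketch
{
    public static void Main()
    {
        // Hypothetical dataset URL; real ones are listed on your
        // Windows Azure Marketplace subscription page.
        const string feedUrl =
            "https://api.datamarket.azure.com/SomePublisher/SomeDataset/v1/Items";

        var client = new WebClient
        {
            // The Marketplace accepted the account key as a Basic-auth
            // password; the user name portion was not significant.
            Credentials = new NetworkCredential("accountKey", "<your-account-key>")
        };

        // The response is an AtomPub/OData document; parse it yourself
        // or add a WCF Data Services service reference instead.
        string atomXml = client.DownloadString(feedUrl);
        Console.WriteLine(atomXml.Substring(0, Math.Min(atomXml.Length, 500)));
    }
}
```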

While Microsoft may be the vanguard of new data-centric initiatives, it's not alone, of course. ReadWriteWeb summarized the emerging data market ... uh, market that developers might tap into in this July piece, and reviewed some of the other players such as Datamarket.com, Factual, CKAN Data Hub and Kasabi. But it looks like Microsoft is indeed the frontrunner. The site even wondered "Is Microsoft's Future in Data-as-a-Service?"

But one worrisome trend that could stall this movement is the possible loss of hundreds of thousands of raw data sources that come from the federal government, as the tanking economy threatens to impose cost-cutting measures that would eliminate or severely curtail services such as Data.gov. "When the current budget cuts were revealed to include cuts to the e-government fund that supports Data.gov, everyone started questioning Data.gov's value," reads a blog posting from the Sunlight Foundation last April when budget cuts were announced. "The cuts could spell the end of Data.gov," warned a Washington Post blog at the time. And this is with a Democrat in the White House!

The site is still up for the time being, but it's somewhat alarming that the last blog posting on the Data.gov site's Open Data section announced the resignation of the program executive last summer. And there's little activity on the forums in the "Developer's Corner" section of the site.

But with demand, there will be supply, of course, so data markets such as Windows Azure Marketplace will continue to provide valuable information that can be incorporated into exciting new development opportunities -- you just might have to pay more for less. But that's nothing new these days.

What do you think about the Windows Azure Marketplace and data markets and opportunities for development of new apps? What's the coolest app you've found that utilizes this data? Do you think the government should continue to fund sites such as Data.gov in this dire economy? Comment here or drop me a line.

Posted by David Ramel on 12/20/2011 at 12:52 PM


SQL Azure Gets Tune-Up

There were a few database-related goodies in Microsoft's announcement today about multiple Windows Azure updates, including a new Metro-like UI for the management portal, SQL Azure Federation, increased database size and lower cost-per-gigabyte for the biggest databases.

The Metro-style UI for the SQL Azure Management Portal includes new features such as "new workspaces with the ability to more easily monitor databases, drill-down into schemas, query plans, spatial data, indexes/keys, and query performance statistics," according to a Windows Azure blog post by Bob Kelly. The post explained that the updates were part of the new "SQL Azure Q4 2011 Service Release," the details of which were posted on another page by Gregory Leake.

The size of the largest allowable database increases to 150GB from 50GB, Leake said, while a new price cap will decrease the cost-per-gigabyte by 67 percent for the biggest databases. The cap is $499.95 per month.

SQL Azure Federation means "databases can be elastically scaled out using sharding based on database size and application workload," the post said. Federation will be supported in the new portal.
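From what has been published so far, federations surface as plain T-SQL, which means you can drive them from ordinary ADO.NET code. Here's a minimal sketch; the federation name, key and connection handling are hypothetical, and the exact syntax should be checked against the forthcoming documentation:

```csharp
using System.Data.SqlClient;

public class FederationSketch
{
    // One-time setup: a federation sharded on a bigint customer id.
    public static void CreateFederation(SqlConnection conn)
    {
        using (var cmd = new SqlCommand(
            "CREATE FEDERATION Orders_Federation (cust_id BIGINT RANGE)", conn))
        {
            cmd.ExecuteNonQuery();
        }
    }

    // Route an open connection to the member holding one customer;
    // FILTERING = ON scopes subsequent queries to that customer's rows.
    public static void RouteToCustomer(SqlConnection conn, long customerId)
    {
        using (var cmd = new SqlCommand(
            "USE FEDERATION Orders_Federation (cust_id = " + customerId + ") " +
            "WITH RESET, FILTERING = ON", conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}
```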

Other improvements include an updated CTP for the DAC Import/Export Service, which reportedly fixes several issues and allows easy import and export of databases between SQL Azure and BLOB storage.

Also, user-controlled collations are now supported, which means users can choose which type of collation to use when creating databases.
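That one boils down to a single T-SQL statement issued against the master database, something like this hedged sketch (the database name is made up, and the collation shown is just one case-sensitive option):

```csharp
using System.Data.SqlClient;

public class CollationSketch
{
    public static void CreateCaseSensitiveDb(string masterConnectionString)
    {
        using (var conn = new SqlConnection(masterConnectionString))
        {
            conn.Open();
            // The collation can now be chosen at creation time instead of
            // being fixed to the single prior default.
            using (var cmd = new SqlCommand(
                "CREATE DATABASE InventoryDb " +
                "COLLATE SQL_Latin1_General_CP1_CS_AS", conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```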

Microsoft said to stay tuned for more posts explaining SQL Azure Federation and the new management portal in more detail.

Posted by David Ramel on 12/12/2011 at 12:52 PM


Linux Added to the SQL Server Driver Parade

It took about three years from the release of the first Windows-specific SQL Server to a kind of opening up of the architecture, when an Open Database Connectivity (ODBC) driver was included with SQL Server 7.0 in 1998. Some 13 years later, Microsoft has released the first preview of an ODBC driver for Linux.

Announced at the PASS conference in October, the Linux driver was released earlier this week. Specifically, it's a 64-bit driver (32-bit is planned) only for Red Hat Enterprise Linux 5, but it's a start.

This is just the latest move in an openness campaign underway at Microsoft (or what the company calls "Microsoft's Commitment to Interoperability"), something that would've been unheard of not that long ago, it seems. At about the same time as the Linux announcement, the company dropped the CTP3 of the JDBC Driver 4.0.

In August 2010, Microsoft Drivers for PHP for SQL Server 2.0 were released, for the first time including the PDO_SQLSRV driver, which supports PHP Data Objects.

A few months ago, Microsoft announced it was jumping all the way onto the ODBC bandwagon and planning to phase out the OLE DB technology it invented.

And, of course, I recently wrote about another opening up of SQL Server: the discontinuation of the LINQ to HPC project, replaced by support for the open source Apache Hadoop "big data" technology.

You can read more about Microsoft's database connectivity initiatives for ODBC, Java, PHP and more here. The company just continues to embrace new technologies and attract new developers. Welcome to the party.

What's the next open source move you'd like to see Microsoft make? Comment here or drop me a line.

Posted by David Ramel on 12/01/2011 at 12:52 PM


Microsoft Says It's Serious About Hadoop

The SQL Server world was abuzz over last week's announcement that Microsoft was discontinuing its LINQ to HPC (high-performance computing) "big data" project in favor of supporting the open source Apache Hadoop in Windows Server and Windows Azure.

This was an interesting development in the larger context of Microsoft's about-face embrace of the open source world, given the many who have questioned its motives and commitment (remember long-ago headlines such as "Microsoft raps open-source approach"?).

But if Denny Lee is representative of Microsoft's motives and commitment, it seems pretty genuine to me. Check out the blog he posted earlier this week, "What's so BIG about 'Big Data'?"

"We are diving deeper into the world of Big Data by embracing and contributing to the open source community and Hadoop," Lee said. And under a heading of "Openness - yes, we're serious about it!", he said "A key aspect is openness and our commitment to give back to the Open Source community." He then talks about Microsoft's participation in last week's "ultimate open source conference," ApacheCon North America 2011.

Lee said Hadoop is important to his Customer Advisory Team because "it is important for our customers," which may sound like marketing-speak, but he notes "we work on some of the most complex Tier-1 Enterprise SQL Server implementations" and goes on to discuss technical aspects of projects such as Yahoo's "largest known cube."

Lee explained more on his personal blog about why he left the BI world to so enthusiastically embrace open source: "It's about the openness of the Open Source community (apologies for the pun) that allows us to focus on solving the actual problem instead of trying to understand how a particular system works."

So say what you will about Microsoft and its marketing strategies, it looks to me like the company has some good people who are doing good work to solve problems that affect real-world users, regardless of the technology used. Sure, it might be a matter of survival in the new IT world, but if it benefits you, take it and run.

What do you think about Hadoop? Comment here or drop me a line.

Posted by David Ramel on 11/17/2011 at 12:52 PM


Developers Offered Pay-For-Use Database Cloud Service

Coinciding with a new SQL Server 2012 licensing model, OpSource Inc. introduced a cloud-based service that offers developers and others purportedly cheaper pay-as-you-go access to major database systems.

Called OpSource Cloud Software, the new product offers access to Microsoft SQL Server 2008 R2 Standard and other software. In a news release, OpSource said the cloud service is "ideal for testing and development."

While SQL Server 2012 comes with two licensing options--"one that is based on computing power, and one that is based on users or devices," according to a six-page datasheet--Cloud Software is available with hourly and monthly on-demand charges, OpSource said. According to a company Web site, SQL Server 2008 R2 costs 66 cents per hour per server. The pricing scheme is a little confusing to me, however. Although the news release stated: "Per Server priced Cloud Software incurs a specific rate per hour when a server is running and a specific rate per hour when a server is stopped," I couldn’t find any information about the rate for a stopped server. So I chatted with Chris, who kind of cleared it up a little, maybe, I think:

You are now chatting with 'Chris'

Chris: Thank you for your interest in OpSource. How may I help you?
me: I'm interested in the Microsoft SQL Server 2008 R2 Cloud Software product. How much is the hourly rate for a stopped server?
Chris: Well for the SQL server license, it has a built in rate of 0.66 cents per hour
Chris: And there will be additional costs for the device footprint as well
Chris: In regards to storage, CPU, and RAM
Chris: In a stopped state, you only pay for the storage
me: For a running server or stopped server? Your news release said there are two different rates for these?
Chris: You will pay the cost of the storage footprint in a standby state
me: What is that pricing structure, for storage?
Chris: However, you will be committed to a 0.66 cent rate even if the device is on standby for SQL
Chris: Well you are only being charged based on the footprint
Chris: Generally, the cost is close to 21.6 cents per GB
Chris: Per month
me: OK. One more question: Do you plan on offering SQL Server 2012 when it's available next year?
Chris: I'm not sure at this moment, I would anticipate us keeping up to date with that version in our new Application Layers
Chris: If you are interested, I can provide you with some trial credit to sandbox the environment
me: No thanks. That's all I had. Bye.
Chris: If you apply our promo code, you can get $200 worth of credit
Chris: Thank you for visiting. Please contact us at anytime.

Our questions and answers got a little out of sync (the chat box didn’t have one of those helpful "Chris is typing" indicators, so I asked more questions before I knew he wasn’t done replying), but you might get the idea, sort of, I hope.
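For what it's worth, here's a back-of-the-envelope reading of the rates Chris quoted. The 720-hour month and the assumption that the license rate applies around the clock are mine, not OpSource's:

```csharp
using System;

public class CloudCostSketch
{
    public static void Main()
    {
        const decimal licensePerHour = 0.66m;     // running SQL Server license
        const decimal storagePerGbMonth = 0.216m; // quoted storage footprint rate
        const int hoursPerMonth = 24 * 30;        // assume a 720-hour month

        Console.WriteLine("License, running all month: {0:C}",
            licensePerHour * hoursPerMonth);      // $475.20
        Console.WriteLine("100 GB storage footprint:   {0:C}",
            storagePerGbMonth * 100);             // $21.60
    }
}
```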

The Cloud Software service also offers several editions of Oracle database products, with "monthly pricing based on number of processors, sockets and server configuration."

OpSource said the SQL Server product "supports up to four processors, up to 64 GB of RAM, one virtual machine, and two failover clustering nodes." It comes bundled with a Windows Server 2008 R2 image.

What do you think? Could this be a cheaper way for developers to test their SQL apps in a pseudo-production environment? Or would you be likely to forget to turn off a server and get one of those nasty cellphone-service-like bill shocks? Comment here or drop me a line.

Posted by David Ramel on 11/10/2011 at 12:52 PM


Developers Can Test 'Denali' in Amazon Cloud

Microsoft and Amazon are collaborating to offer developer testing of the next version of SQL Server in the Amazon cloud, promising an easier and cheaper evaluation than you could get with a local implementation.

The marriage of Microsoft SQL Server "Denali" (now SQL Server 2012) and the Amazon Elastic Compute Cloud means developers only have to pay standard Amazon Web Services (AWS) rates to test the beta database software, currently in Community Technology Preview 3. AWS pricing for "standard on-demand instances" ranges from 12 cents to 96 cents per hour.

An AWS site promises easy deployment in five minutes. "With AWS, companies can utilize the Cloud to easily test the new functionality and features of 'Denali,' without having to purchase and manage hardware," the site says. "This provides customers with faster time to evaluation, without any of the complexity related to setting up and configuring a test lab for beta software."

Sounds good to me. I earlier wrote about how a beta evaluation of SQL Server nearly wrecked my system and caused hours of frustration (for me and many others) when I tried to remove it and install the free, Express version.

The Denali program is part of a broader initiative in which Microsoft has developed Amazon Machine Images (AMI) for testing of Web-based products such as WebMatrix and database-related software--basically SQL Server 2008 R2--all running on Windows Server 2008 R2. The Denali AMI was created just a couple weeks ago.

Have you tried testing any Microsoft products on the Amazon cloud? We'd love to hear about your experience. Comment here or drop me a line.

Posted by David Ramel on 10/27/2011 at 12:52 PM


Some Bumps in the Separation of Entity Framework and .NET Framework

It's almost like a feuding spouse who leaves, only to find out how much they're missed, and then decides not to cut ties completely but to hang out together now and then. Well, almost.

The Entity Framework team disassociated itself from the .NET Framework release schedule after EF 4.0 was released with .NET 4.0. The first manifestation of that new policy came last spring when the EF team released an update, EF 4.1, with developer-requested improvements such as Code First capability and a DbContext API.

"This is the first time we've released part of the Entity Framework as a stand-alone release and we're excited about the ability to get new features into your hands faster than waiting for the next full .NET Framework release," said a posting on the ADO.NET team blog announcing EF 4.1. That was followed up in August with the release of the EF 4.2 Beta 1 preview.

But today comes news that the trial separation didn't work so well and some new EF features--including much-wanted enum support--will have to wait for a full .NET Framework upgrade.

"Our new features that require updates to our core libraries will need to wait for the next .NET Framework release. This includes support for Enum Types, Spatial Types, Table-Valued Functions, Stored Procedures with Multiple Results and Auto-Compiled LINQ Queries" reads an entry on the ADO.NET team blog. [Editor's note: The preceding italicized text was changed due to an error; the italicized text that follows was also changed and refers to this same blog post. We apologize for the errors.]

The post explained that the EF team at first wanted to address these core library updates with a separate, full release of EF instead of waiting for .NET 4.5. The June EF Community Technology Preview was the result, announcing that "The Enum data-type is now available in the Entity Framework."
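For the curious, here's roughly what that promised enum support looks like in model code. The types are hypothetical; the point is that the property maps straight to an integer column, with no conversion boilerplate:

```csharp
// Hypothetical model types, sketched from the CTP's description; the
// enum property persists as its underlying integer value.
public enum OrderStatus { Pending = 0, Shipped = 1, Cancelled = 2 }

public class Order
{
    public int OrderId { get; set; }
    public OrderStatus Status { get; set; } // stored as an int column
}
```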

Well, not so fast. "While we are still pursuing this option it has become clear that from a technical standpoint we are not ready to achieve this immediately," the post said. No details about the technical problems were mentioned. The aforementioned list of EF enhancements "will reappear in a preview of Entity Framework that we will ship alongside the next public preview of .NET 4.5," the post said. The post didn't indicate when that might be.

The .NET Framework 4.5 developer preview was introduced in September at the BUILD conference.

What do you think of the EF and .NET Framework previews? When do you think you'll finally get that enum support? Comment here or drop me a line.

Posted by David Ramel on 10/20/2011 at 12:52 PM


Google, Apple Play Catch-Up to Microsoft (for a change)

I'm no Microsoft fanboi, but I noticed an interesting tidbit when I recently wrote a news article about Google Cloud SQL, which adds a MySQL database service to the company's App Engine development stack.

In the comments section of the blog post announcing the new service was this observation from reader Jeff King:

"Microsoft has had SQL Azure for ages so why would you need this?"

Now that's a switch. Usually it's the other way around: The slow, ponderous, bureaucratic, out-of-touch Redmond software giant is chastised for being behind the times and playing clumsy catch-up to the hip, nimble Web 2.0 pioneer.

Indeed, SQL Azure was introduced in March 2009. Truth be told, after Amazon basically pioneered the cloud phenomenon in 2006, Google beat Microsoft to the punch in the fight for the sky when it introduced App Engine in April 2008, about six months before Windows Azure was unveiled.

But, looking at the database component, it's clear that Microsoft has had a leg up on Google, which heretofore offered a datastore queried via GQL, a language with SQL-like syntax. OK, how many of you developers have liked, or even used, GQL? Raise your hands (or flame me; your choice).

"One of App Engine's most requested features has been a simple way to develop traditional database-driven applications," said the Google Cloud SQL program manager in the previously mentioned blog post. Well, yaaah!

And today I noticed a news report that Apple is preparing to launch its iCloud. I know the products don't really compare--with Apple's focus on music and consumer entertainment as opposed to enterprise development--but launching an iCloud service in late 2011 seems a little iBehind.

And stodgy old Microsoft seems to have acquitted itself well in the cloud despite its late start, judging from this recent Ars Technica headline: "Windows Azure beats Amazon EC2, Google App Engine in cloud speed test."

I've even noticed some positive buzz about Windows Phone in the media as of late. Is Microsoft finally turning things around, like a huge supertanker that takes miles to change direction? Will it (gasp!) become cool? Well, let's not go overboard here.

What do you think about Microsoft: dying dinosaur or comeback kid? Comment here or drop me a line.

Posted by David Ramel on 10/13/2011 at 12:52 PM


Not Your Typical Data Driver Column

Dear ‹FirstName›,

In these trying times you occasionally just need to take a break from the business of data and have a good laugh. Which is what I did when I received the following e-mail, purportedly from a real data-related vendor. I'll protect the innocent by anonymizing the company/personal details in italics, but otherwise the message is presented as received:

Dear ‹FirstName›,

We're ‹insert emotion› to announce our research is nearly complete. In just a few ‹random time duration›, we'll be announcing the new Company Name Telepathy Source and Destination, allowing the everyday man and woman to read minds into an SSIS data stream.

Imagine being able to:

Read the entire encyclopedia in a matter of minutes

Output your wife's thoughts to find out how she really feels

Learn a new skill in seconds like Neo from the Matrix

Over the past week we've run a contest to see who can be the first to view this amazing research and I'm happy to announce that Person's Name is our winner. If you are Person's Name, please click the below link to see our research. If you are not Person's First Name, please do not click below. We operate solely on the honor system at Company Name.

Person's Name Click Here

‹Emotional Stub›,

CEO's Name, Founder of Company Name

So, I don't know if it was meant to harvest contact information or install malware or what, but it certainly provided some much-needed ‹insert pleasant emotion› to the Data Driver. What's the clumsiest troll you've ever received? Comment here or drop me a line.

Posted by David Ramel on 10/05/2011 at 12:52 PM


Windows 8 Ups the Data Transfer Ante

Talk about driving data: the audience broke into applause at last week’s BUILD conference when Microsoft’s Bryon Surace demonstrated some of the new blazing-fast data transfer capabilities during a keynote address.

“With Windows Server 8, we can use multiple NICs [network interface controllers] simultaneously to help improve throughput and fault tolerance,” Surace said.

To demonstrate the new speedy data-transfer capabilities, Surace used a server running Hyper-V with two virtual machines, one of which was connected to two disks. One disk was connected using a 1Gb (gigabit) Ethernet connection, a setup he described as “very typical, very commonplace in today’s environment.”

The other disk was connected “using multiple high-speed NICs that are leveraging SMB 2.2 Multichannel and RDMA [remote direct memory access].” Starting up a SQL load generator and going to a performance monitor, Surace pointed out how the 1Gb Ethernet card was transferring data at less than 100MB/sec., which he said was “pretty typical.” (A gigabit link tops out at roughly 125MB/sec. in theory.) The second disk, however, was transferring data at more than 2GB/sec. That’s when the applause broke out.

“Now, previously, these technologies were only available in high-performance computing, but now with Windows Server 8, we're building them for one of the most common roles in Windows,” Surace said. He went on to show that the NIC wasn’t saturated, but rather was using only about 15 percent of the available throughput.

“This is a clear indication that we haven't even scratched the surface of what's possible with Windows Server 8,” he said. “And as we move over and take a look at the performance, we're only using about 1 percent of the CPU on the server to be able to push this throughput.”

Surace also demonstrated the simplified storage array management capabilities of Windows Server 8. For this, he used a server connected to 16 solid-state drives (SSDs), with no specialized controllers, “just a bunch of disks, or JBOD, directly connected to our server and being managed by Windows.” He noted how the disks were used to create a storage pool from which some space was carved out and represented as a drive on the server. He also showed file shares connected by the “improved SMB 2.2 protocol.”

“So, the key here is you don't need a PhD in storage,” Surace said. “You can simply attach just a bunch of disks to Windows and have it all managed and deployed right there.”

The full keynote can be viewed via Microsoft’s Channel 9 video service.

What are the software development ramifications of the new Windows Server 8? Comment here or drop me a line.

Posted by David Ramel on 09/21/2011 at 12:52 PM