Posts Tagged ‘sql’

Science Daily, January 19th, 2010

Even though it was buried in a chart, I considered the May 2010 due date for SQL Server 2008 R2 to be “official.” But on January 19, Microsoft made it officially official.

Microsoft is confirming that the latest version of its database will be out “by May” and will be on the May price list. A new posting on the Microsoft Data Platform Insider blog confirmed the May date.

According to that blog, there have been 150,000 downloads by testers of the R2 release.

The November Community Technology Preview (CTP) release was the last test build Microsoft is planning to issue for the product, officials said.

SQL Server 2008 R2, codenamed Kilimanjaro, will come in a number of new flavors, including a Datacenter edition and a Parallel Data Warehouse edition (formerly codenamed “Project Madison”).

The Datacenter edition builds on the SQL Server 2008 R2 Enterprise product, but adds application and multi-server management; virtualization; high-scale complex event processing (via StreamInsight); and support for more than 8 processors and up to 256 logical processors.

The Parallel Data Warehouse version will be sold preloaded on servers as a data warehouse appliance. Using the DATAllegro technology Microsoft acquired in 2008, it will scale customers’ data warehouses from the tens of terabytes up to the petabyte-plus range, according to the company.

All versions will be available commercially in May except Parallel Data Warehouse, a spokesperson said. All Microsoft will say about that edition is that it will be out in the first half of 2010 (so I guess that means it could be in June).

Synergetics is awarded the “Best .NET Training Service Provider” by Microsoft.

On ScottGu’s Blog, 24th Jan 2010

Technical debates are discussed endlessly within the blog-o-sphere/twitter-verse, and they range across every developer community. Each language, framework, tool, and platform inevitably has at least a few going on at any particular point in time.

Below are a few observations I’ve made over the years about technical debates in general, as well as some comments about the discussions I’ve seen recently on the topic of ASP.NET Web Forms and ASP.NET MVC in particular.

General Observations About Technical Debates

Below are a few general observations independent of any specific technical debate:

a) Developers love to passionately debate and compare languages, frameworks, APIs, and tools.  This is true in every programming community (ASP.NET, Java, PHP, C++, Ruby, Python, etc.).  I think you can view these types of religious technical debates in two ways:

  1. They are sometimes annoying and often a waste of time.
  2. They are often a sign of a healthy and active community (since passion means people care deeply on both sides of a debate, and is far better than apathy).

Personally I think both points are true.

b) There is never only “one right way” to develop something. As an opening interview question I sometimes ask people to sort an array of numbers in the most efficient way they can.  Most people don’t do well with it.  This is usually not because they don’t know sort algorithms, but rather because they never think to ask about the scenarios and requirements behind it – which is critical to understanding the most efficient way to do it.  How big is the sequence of numbers? How random is the typical number sequence (is it sometimes already mostly sorted, how big is the spread of numbers, are the numbers all unique, do duplicates cluster together)? How parallel is the computer architecture?  Can you allocate memory as part of the sort or must it be constant?  Etc. These are important questions to ask because the most efficient and optimal way to sort an array of numbers depends on the answers.

Whenever people assert that there is only “one right way” to solve a programming problem, they are almost always assuming a fixed set of requirements/scenarios/inputs – which is rarely optimal for every scenario or every developer.  And to state the obvious – most problems in programming are far more complex than sorting an array of numbers.
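To make the interview point concrete, here is a minimal Python sketch (the function and the value range are illustrative assumptions, not from the post): when the inputs are known to be integers in a small fixed range, a counting sort runs in O(n + k) and can beat any general comparison sort – while for arbitrary floating-point inputs it does not apply at all.

```python
import random

def counting_sort(nums, max_value):
    """O(n + k) sort for integers known to lie in the range [0, max_value]."""
    counts = [0] * (max_value + 1)
    for n in nums:
        counts[n] += 1          # tally each value
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)  # emit values in order
    return result

# For 8-bit values the range assumption holds; for arbitrary floats it would not.
data = [random.randint(0, 255) for _ in range(10_000)]
assert counting_sort(data, 255) == sorted(data)
```

The same inputs sorted with Python’s built-in comparison sort would of course also be correct; the question of which approach is fastest depends entirely on the scenario – which is exactly what the interview answer should probe.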

c) Great developers using bad tools/frameworks can make great apps. Bad developers using great tools/frameworks can make bad apps. Be very careful about making broad assumptions (good or bad) about the quality of the app you are building based on the tools/frameworks used.

d) Developers (good and bad) can grow stronger by stretching themselves and learning new ideas and approaches.  Even if they ultimately don’t use something new directly, the act of learning it can sharpen them in positive ways.

e) Change is constant in the technology industry.  Change can be scary.  Whether you get overwhelmed by change, though, ultimately comes down to whether you let yourself be overwhelmed.  Don’t stress about having to stop and suddenly learn a bunch of new things – rarely do you have to. The best approach to avoid being overwhelmed is to be pragmatic, stay reasonably informed about a broad set of things at a high-level (not just technologies and tools but also methodologies), and have the confidence to know that if it is important to learn a new technology, then your existing development skills will mostly transition and help.  Syntax and APIs are rarely the most important thing anyway when it comes to development – problem solving, customer empathy/engagement, and the ability to stay focused and disciplined on a project are much more valuable.

f) Some guidance I occasionally give people on my team when working and communicating with others:

  1. You will rarely win a debate with someone by telling them that they are stupid – no matter how well intentioned or eloquent your explanation of their IQ problems might be.
  2. There will always be someone somewhere in the world who is smarter than you – don’t always assume that they aren’t in the room with you.
  3. People you interact with too often forget the praise you give them, and too often remember a past insult – so be judicious with insults, since they come back to haunt you later.
  4. People can and do change their minds – be open to being persuaded in a debate, and neither gloat nor hold it against someone else if they also change their minds.

g) I always find it somewhat ironic when I hear people complain about programming abstractions not being good.  Especially when these complaints are published via blogs – whose content is displayed using HTML, is styled with CSS, made interactive with JavaScript, transported over the wire using HTTP, and implemented on the server with apps written in higher-level languages, using object oriented garbage collected frameworks, running on top of either interpreted or JIT-compiled byte code runtimes, and which ultimately store the blog content and comments in relational databases ultimately accessed via SQL query strings.  All of this running within a VM on a hosted server – with the OS within the VM partitioning memory across kernel and user mode process boundaries, scheduling work using threads, raising device events using signals, and using an abstract storage API for disk persistence.  It is worth keeping all of that in mind the next time you are reading an “ORM vs Stored Procedures” or “server controls – good/bad?” post.  The more interesting debates are about what the best abstractions are for a particular problem.

h) The history of programming debates is one long infinite loop – with most programming ideas having been solved multiple times before.  And for what it’s worth – many of the problems we debate today were long ago solved with LISP and Smalltalk.  Ironically, despite pioneering a number of things quite elegantly, these two languages tend not to be used much anymore. Go figure.

Observations About the ASP.NET Web Forms / MVC Debate

a) Web Forms and MVC are two approaches for building ASP.NET apps. They are both good choices. Each can be the “best choice” for a particular solution depending on the requirements of the application and the background of the team members involved. You can build great apps with either.  You can build bad apps with either. You are not a good or bad developer depending on what you choose. You can be absolutely great or worthless using either.

b) The ASP.NET and Visual Studio teams are investing heavily in both Web Forms and MVC.  Neither is going away.  Both have major releases coming in the months ahead.  ASP.NET 4 includes major updates to Web Forms (clean ClientIDs and CSS based markup output, smaller ViewState, URL Routing, new data and charting controls, new dynamic data features, new SEO APIs, new VS designer and project improvements, etc, etc).  ASP.NET 4 will also ship with ASP.NET MVC 2 which also includes major updates (strongly typed helpers, model validation, areas, better scaffolding, Async support, more helper APIs, etc, etc).  Don’t angst about either being a dead-end or something you have to change to.  I suspect that long after we are all dead and gone there will be servers somewhere on the Internet still running both ASP.NET Web Forms and ASP.NET MVC based apps.

c) Web Forms and MVC share far more code/infrastructure/APIs than anyone on either side of any debate about them ever mentions – Authentication, Authorization, Membership, Roles, URL Routing, Caching, Session State, Profiles, Configuration, Compilation, .aspx pages, .master files, .ascx files, Global.asax, Request/Response/Cookie APIs, Health Monitoring, Process Model, Tracing, Deployment, AJAX, etc, etc, etc.  All of that common stuff you learn is equally valid regardless of how you construct your UI.  Going forward we’ll continue to invest heavily in building core ASP.NET features that work for both Web Forms and MVC (like the URL Routing, Deployment, Output Caching, and DataAnnotations for Validation features we are adding with ASP.NET 4).

d) I often find debates around programming model appropriateness and abstractions a little silly. Both Web Forms and MVC are web framework abstractions, built on top of a broader framework abstraction, programmed with higher-level programming languages, running on top of an execution engine abstraction that is itself running on top of a giant abstraction called an OS.  What you are creating with each is HTML/CSS/JavaScript (all abstractions persisted as text and transmitted over HTTP – another higher-level protocol abstraction).

The interesting question to debate is not whether abstractions are good or not – but rather which abstractions feel most natural to you, and which map best to the requirements/scenarios/developers of your project.

e) As part of that we will be posting more end to end tutorials/content (for both Web Forms and MVC).  We will also be providing tutorials and guidance that will help developers quickly evaluate both the Web Forms and MVC approach, easily learn the basics about how both work, and quickly determine which one feels best for them to use. This will make it easy for developers new to ASP.NET, as well as developers who already know either Web Forms or MVC, to understand and evaluate the two approaches and decide which they want to use.

f) Decide for each project whether you want to use Web Forms or MVC and feel good about it.  Both can be good choices.  Respect the choices other people make – the choice they have made is also hopefully a good one that works well for them.  Keep in mind that in all likelihood they know a lot more about their own business/skills than you do.  Likewise you hopefully know a lot more about your own business/skills than they do.

g) Share ideas and best practices with others.  That is a big part of what blogs, forums, listservs and community is all about.  What makes them work great is when people know that their ideas aren’t going to be ripped to shreds, and that they will be treated with respect.  Be constructive, not snarky. Teach, don’t lecture. Remember there is always someone else out there who you can also learn from.


On CRN.In, By Joseph F Kovar, ChannelWeb, January 13, 2010

VMware said it planned to purchase Zimbra from Yahoo, a move that gives the virtualization leader a strong, cloud-based, open-source collaboration suite with which it could attack rival Microsoft’s Outlook and Exchange e-mail applications.

Financial details of the acquisition were not released. However, speculation over the past week or so put the price at about $100 million, much lower than the $350 million Yahoo paid when it bought Zimbra in 2007.

Zimbra is the developer of the open-source Zimbra Collaboration Suite, which includes applications to coordinate, manage, and share e-mails from multiple vendors, including Microsoft’s Outlook, in a single interface; perform group scheduling; and handle desktop and mobile device synchronization.

The company currently serves 55 million mailboxes, with overall mailbox growth of 86 percent and SMB mailbox growth of 165 percent in 2009, VMware said.

The acquisition, once it closes, would be the second open-source acquisition for VMware.

The company in August acquired SpringSource, a developer of applications based on open-source technologies and a leader in such open-source communities as the enterprise Java programming model Spring Framework, the Apache Tomcat Java application server environment, and the Groovy and Grails dynamic language and Web application framework.

In a blog post on the acquisition, VMware CTO Steve Herrod wrote that Zimbra will help VMware enhance its cloud computing offerings in two ways.

First, Herrod wrote, it will help VMware simplify IT. Zimbra is the most popular software for developing virtual appliances, Herrod wrote. “Once deployed onto VMware vSphere, the Zimbra virtual appliance will automatically benefit from the built-in VMware vSphere scalability, availability, and security services,” he wrote.

The acquisition also lets VMware expand on its vCloud cloud computing technology and SpringSource platform-as-a-service capabilities by adding an integrated portfolio of applications, giving VMware a software-as-a-service offering.

The Zimbra Collaboration Suite also competes in some ways with some of archrival Microsoft’s key products, including Office, giving VMware another tool for competing with Microsoft.

However, Herrod wrote in his blog, VMware does not want to alienate Microsoft Office users from working with VMware’s vSphere virtualization technology, which competes with Microsoft’s Hyper-V.

“VMware vSphere is and will continue to be an outstanding platform for the deployment of Microsoft Exchange. We have heavily optimized our virtualization offerings specifically for the deployment of Microsoft Exchange, and thousands of companies are benefiting from the increased flexibility, availability, and security that comes from running Microsoft Exchange on top of VMware vSphere,” he wrote.

VMware brings Zimbra the opportunity to become more involved in cloud computing, wrote Jim Morrisroe, vice president of sales at Zimbra, in a blog post on the acquisition.

“Private and/or public cloud computing networks can work together and applications can be deployed and managed seamlessly across those clouds. Zimbra products were designed from the ground up with virtualization and the cloud in mind, with a modular architecture and APIs to allow distributed access to data and storage,” Morrisroe wrote. “Email and collaboration services have always been ubiquitous to organizations, but now the barriers to transitioning them to efficient virtualized environments will be much more seamless.”


By Paul Krill | InfoWorld

Project Kenai will still be used internally, but external users must move elsewhere as Oracle consolidates hosting sites

In the wake of its merger with Sun Microsystems, Oracle is discontinuing access to Project Kenai, which was developed by Sun as an open source project-hosting site.

Kenai, Oracle said in an updated FAQ statement for developers on Tuesday, will be discontinued for public use. “Oracle will continue to use it internally and look for ways that our customers can take advantage of it,” the Oracle FAQ said.

[InfoWorld’s Paul Krill reported last week that Oracle canceled plans for the Sun Cloud public cloud service announced by Sun last year. ]

The phasing-out is being done to consolidate project-hosting, according to the Project Kenai Team in a Web posting about the future of Kenai. “Minimizing the number of current project-hosting sites is a start in this direction,” the team said.

At the Kenai beta site, users were advised to begin migrating repositories and content to other locations.

“The complete shutdown of the site and the removal of the domain will be completed in the next 60 days (April 2nd 2010). This should provide ample time for all projects to be moved to a new home of the project owner’s choice,” the Kenai team said.

“Any projects that remain after the 60 day limit (April 2nd 2010) will be removed when the site is turned off,” the team said.

“While it has come time to close the domain of Kenai.com, the infrastructure, which is already used under Netbeans.org, will live on to support other domains in the future,” the team said.

In the FAQ, Oracle also lauded the combination of the OTN (Oracle Technology Network), the Sun BigAdmin system administration portal, and the Sun Developer Network, which includes the java.sun.com Web site.

This combination will “result in the largest and most diverse community of developers, database administrators, sysadmins, and architects,” Oracle said.

For the near future, the Sun Developer Network and BigAdmin will remain in their current forms, Oracle said. The company foresees an integration of these sites into a redesigned and re-architected OTN.

Also, Oracle plans to continue to offer certifications for Sun technologies including Java, SPARC, Solaris, and MySQL as part of the Oracle University program.

Oracle one week ago today detailed ambitious plans for its newly acquired Sun technologies.

This story, “Oracle shutting off Sun project-hosting site,” was originally published at InfoWorld.com. Follow the latest news in software development at InfoWorld.com.

Synergetics is a premium brand in the Indian IT industry in the area of people competency development, engaged in delivering it through its training and consulting interventions, primarily focusing on productivity with regard to the projects and deliverables on hand. Its primary differentiators have been its solution-centric approach and its comprehensive, client-focused service portfolio.

By Kevin Fogarty, on 26th Jan 2010

CIO – IT people with skills and experience in server virtualization, cloud computing or both have a far greater chance of getting and keeping jobs than most other IT people right now, according to recruiters and analysts. But what do you call these gurus? There’s no accepted standard for what to call either virtualization or cloud-computing specialists, so jobseekers will have to search for a range of keywords, and include those in their resumes, to find a match with particular employers, says Dice.com spokesperson Jennifer Bewley.

If you are searching for a virtualization or cloud role, watch your search terms, she says.

Just using “virtualization” as a keyword pulled up 880 jobs on Dice.com on one day last week, according to Bewley.

“However, there are another 900 jobs that include ‘VMware’ as a keyword with no mention of virtualization,” Bewley found. “That leads us to conclude that searching based on vendor is particularly important in virtualization jobs.”

Common terms for virtualization specialists include: Architect SAN/Virtualization; Citrix / VMware specialist or administrator; Data Center Virtualization Systems Analyst; and Product Manager for Large Scale Virtualization.

Cloud Job Searching Tricky

People looking for new jobs described using “cloud computing” may not be completely out of luck, but they’re not far off, according to M. Victor Janulaitis, CEO of IT job-market researcher Janco Associates, which this month published a survey of CIO hiring plans for 2010.

“I have seen some people just use ‘cloud specialist’ to describe themselves,” Janulaitis says. “There’s not really a set of terms yet that are common to refer to cloud computing skills-they just refer to them as architecture or infrastructure skill sets.”

[For timely cloud computing news and expert analysis, see CIO.com’s cloud computing Drilldown section. ]

Other people just add “cloud” or “virtualization” onto more common titles such as system administrator, systems engineer, architect, or network engineer, Bewley says.

Remember, it’s not unusual for employers or prospective employers to use the name of a particular vendor or new technology as a primary identifier in a job ad, especially if they’re looking for someone certified in that vendor’s technology, according to Tom Silver, senior vice president of tech-job ad site Dice North America.

Normally a company would look for technical skills in a particular technology, not just one vendor’s products, Janulaitis says.

“With the economy the way it is, and now people are talking about the possibility of a double-dip recession, a lot of companies are just looking for basic skills and experience,” Janulaitis says.

That means plugging holes where they have to – by hiring one person with experience managing VMware servers, for example – or hiring a lot of junior-level generalists who can be trained in the skills the company needs, Janulaitis says.

Virtualization Salaries Flattening?

Salaries for virtualization specialists have also hit a plateau, though ads for them increased 30 percent in 2009 compared to 2008, Silver says.

After surging 10 percent in 2008 and into 2009, salaries for virtualization experts were flat this year at an average of $84,777, Dice.com data shows.

That’s still a premium compared to the national average of $78,845 per year for other tech workers, however, Bewley says.

Company proposes ‘The Cloud Computing Advancement Act’

Microsoft Session at Cloud Expo

Today, Brad Smith, senior vice president and general counsel at Microsoft Corp., urged both Congress and the information technology industry to act now to ensure that the burgeoning era of cloud computing is guided by an international commitment to privacy, security and transparency for consumers, businesses and government.

During a keynote speech to the Brookings Institution policy forum, “Cloud Computing for Business and Society,” Smith also highlighted data from a survey commissioned by Microsoft measuring attitudes on cloud computing among business leaders and the general population.

The survey found that while 58 percent of the general population and 86 percent of senior business leaders are excited about the potential of Cloud Computing, more than 90 percent of these same people are concerned about the security, access and privacy of their own data in the cloud. In addition, the survey found that the majority of all audiences believe the U.S. government should establish laws, rules and policies for cloud computing.

At today’s event, Smith called for a national conversation about how to build confidence in the cloud and proposed the Cloud Computing Advancement Act to promote innovation, protect consumers and provide government with new tools to address the critical issues of data privacy and security. Smith also called for an international dialogue on data sovereignty to guarantee to users that their data is subject to the same rules and regulations, regardless of where the data resides.

“The PC revolution empowered individuals and democratized technology in new and profoundly important ways,” said Smith in his keynote address. “As we move to embrace the cloud, we should build on that success and preserve the personalization of technology by making sure privacy rights are preserved, data security is strengthened and an international understanding is developed about the governance of data when it crosses national borders.”

He continued, “Microsoft is committed to fostering the responsible development of cloud computing to ensure that data is accessible, safe and secure. We also need government to modernize the laws, adapt them to the cloud, and adopt new measures to protect privacy and promote security. There is no doubt that the future holds even more opportunities than the present, but it also contains critical challenges that we must address now if we want to take full advantage of the potential of Cloud Computing.”

Microsoft and Intuit are going to join their clouds, Azure and the Intuit Partner Platform (IPP), so developers can deliver and market web applications to the 27 million QuickBooks-using small businesses through the Intuit App Center.

The integration also means that small businesses can use Microsoft’s cloud-based productivity applications via the Intuit App Center, presumably heading off some losses to Google Apps and Zoho.

The deal calls for Azure to be an Intuit preferred platform.

There’s a free Azure beta SDK already available that will federate applications developed on Azure with the go-to-market IPP.

Integration is based on an extension of the QuickBooks data model and will provide APIs for single sign-on, billing, data integration and user management.

The companies expect a flood of SaaS apps to follow since together they have some 750,000 development firms and channel partners.

Azure launched February 1. Later this year, after the companies get the integration just right and the widgetry is formally out, Microsoft will make its Business Productivity Online Suite, including Exchange Online, SharePoint Online, Office Live Meeting and Office Communications Online, available for purchase in the Intuit App Center.

On CIOL

Ovum: The next three years will see cloud computing mature rapidly as vendors and enterprises come to grips with the opportunities and challenges that it represents.

MELBOURNE, AUSTRALIA: Cloud computing, the most important trend for 2010, has barely even started, says Ovum, an analyst and consulting company. The next three years will see cloud computing mature rapidly as vendors and enterprises come to grips with the opportunities and challenges that it represents.

Cloud computing – Been There Done That

Some prefer to limit cloud computing to infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS), whilst others also consider software-as-a-service (SaaS) and private clouds part of the phenomenon.

A wider perspective helps understand one of the key trends in cloud computing – cloud computing will be hybrid. “Enterprises will mix and match public and private cloud elements with traditional hosting and outsourcing services to create solutions that fit short and long-term requirements”, comments Laurent Lachal, Senior Analyst.

“The past 18 months have seen a significant shift in focus away from public clouds towards private ones owing to a powerful mix of vendor push and user pull”, said London-based Lachal. The private cloud is, to a large extent, a re-badging of what data centre-focused hardware, software and service vendors have been doing under different names (such as utility computing, autonomic IT, on-demand data centre, etc.) for the past 10 years.

Many users are wary of public clouds’ quality of service in areas such as reliability, availability, scalability and security but curious about the possibility of adopting some of their characteristics (e.g. on demand instant provisioning of IT assets).

Private clouds are either defined as the aim of the data centre evolution journey (a long patient maturation process) or as shortcuts along the way that push parts of the data centre ahead to deliver focused return on investment (the private cloud is the part(s) of the data centre ahead of the rest).

What is needed is a way to reconcile the two approaches (private-cloud-as-a-journey and private-cloud-as-a-shortcut) to understand when, on the road towards a next-generation data centre, users should take shortcuts. Unfortunately, most vendors currently emphasise the second approach rather than trying to reconcile the two.

“Cloud computing promises to tackle two so-far irreconcilable IT challenges – the need to lower costs and boost innovation – but it will take a lot of effort from enterprises to actually make it work. Instead of a nimbler IT with their IT mess for less somewhere else, the ill-prepared will end up with their IT mess spread across a wider area”, said Lachal.

Lachal believes that adoption is a two-way street. “It is not just about whether cloud computing is ready for enterprises, it is, more importantly, whether or not enterprises are ready for it”, said Lachal, author of the report.  The fact is that many enterprises are currently not particularly ready for either private or public clouds or any type of hybrids in between.

Besides the current confusion as to what exactly cloud computing is, many enterprises lack the knowledge, skills and metrics to figure out what is best for them. They need to be able to figure out how to mix and match:

· Totally private and shared private clouds (to collaborate with partners on common goals).

· Public and private clouds, with public clouds used, for example, for workloads that have unpredictable spikes in their use, for applications that are only occasionally used, or to move pre-production tasks (tests, migrations, etc.) onto public clouds and turn the existing pre-production infrastructure into production capacity (since pre-production tasks have much lower quality-of-service requirements than production ones).

· Public clouds and traditional hosting/outsourcing service offerings: hosted offerings, for example, are usually cheaper than the Amazon IaaS service for static web sites. On the other hand, for a use such as application testing, where a handful of servers is required for a few weeks and a few hours per day, Amazon IaaS is the answer.

· Public cloud offerings (IaaS, PaaS and SaaS), chosen on the basis of their respective cost-effectiveness.
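The hosting trade-off in the third bullet comes down to simple arithmetic. The rates in this Python sketch are hypothetical placeholders for illustration, not actual Amazon or hosting prices:

```python
# Hypothetical rates for illustration only -- not real vendor pricing.
HOSTED_MONTHLY_PER_SERVER = 50.00   # flat fee for an always-on hosted server
IAAS_HOURLY_PER_SERVER = 0.12       # pay-per-use rate on an IaaS cloud

def iaas_cost(servers, hours_per_day, days):
    """Pay-per-use cost of running a workload on IaaS for a given period."""
    return servers * hours_per_day * days * IAAS_HOURLY_PER_SERVER

# Static web site: one server, always on, ~30 days a month.
always_on = iaas_cost(1, 24, 30)    # ~$86 on IaaS vs $50 hosted
# Application testing: five servers, four hours a day, for three weeks.
burst = iaas_cost(5, 4, 21)         # ~$50 on IaaS vs $250 for five hosted servers
print(f"always-on: ${always_on:.2f}, burst: ${burst:.2f}")
```

Run against real price lists, the same two-line model shows where the break-even point sits for any workload shape: always-on workloads favour the flat-fee option, while short, part-time bursts favour pay-per-use.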

To do so, they need to improve their knowledge of which assets cost what in public and private clouds and in traditional hosting/outsourcing service offerings, as well as their ability to monitor, meter and bill usage.  Few enterprises can currently do so. Achieving all of this will take time and tears.