On SD Times,

LOS ANGELES – It’s Windows Azure, front and center at the Microsoft Professional Developers Conference, which began today in Los Angeles.

Ray Ozzie, Microsoft’s chief software architect, took the keynote stage to announce the pending availability of Windows Azure. He also announced two Azure services: a data catalog called “Dallas,” and a business application marketplace called “Pinpoint.”

Windows Azure will go into production on Jan. 1, 2010, Ozzie said. Meanwhile, Microsoft will continue to add new features, and select partners will begin hosting commercial applications on Azure immediately.

Matt Mullenweg, founder of WordPress developer Automattic, joined Ozzie on stage to announce that WordPress has been migrated to Windows Azure at the infrastructure-as-a-service level. Popular blogs, including “I Can Has Cheezburger?” and “FAIL Blog,” are now running on Windows Azure.

Dallas is an information service that Ozzie called a “game-changing subsystem.” Dallas, which is built on Windows Azure and SQL Azure, allows developers to access commercial and private data sets across platforms.

Pinpoint is a catalog of business applications and services; it targets developers and is integrated into the Azure development portal.

Dallas is seeded with trial data sets from media companies, the U.S. government and others. Data can be bound to applications via ATOM feeds and REST, said Microsoft technical fellow David Campbell.
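To make that ATOM/REST binding more concrete, below is a minimal sketch (in Python, purely for illustration) of how a client might pull a data set as an ATOM feed over HTTP and list its entries. The feed URL and the AccountKey header are hypothetical placeholders, not the actual Dallas API.

    # Minimal sketch: fetch a data set exposed as an ATOM feed over REST and
    # list its entries. The URL and the AccountKey header are hypothetical
    # placeholders, not the real "Dallas" endpoints.
    import urllib.request
    import xml.etree.ElementTree as ET

    ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}

    def fetch_atom_entries(feed_url, account_key):
        """Request an ATOM feed and return (title, updated) pairs."""
        req = urllib.request.Request(
            feed_url,
            headers={"AccountKey": account_key},  # hypothetical auth header
        )
        with urllib.request.urlopen(req) as resp:
            root = ET.fromstring(resp.read())
        return [
            (entry.findtext("atom:title", default="", namespaces=ATOM_NS),
             entry.findtext("atom:updated", default="", namespaces=ATOM_NS))
            for entry in root.findall("atom:entry", ATOM_NS)
        ]

    if __name__ == "__main__":
        # Placeholder URL; a real data set would publish its own feed address.
        for title, updated in fetch_atom_entries(
                "https://example.com/catalog/some-dataset", "YOUR-ACCOUNT-KEY"):
            print(updated, title)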

Bob Muglia, president of Microsoft’s Server and Tools business, described how the company plans to help developers structure applications for Azure and make them portable enough to migrate from a Windows data center to the cloud. “Developers have to evolve applications to take advantage of the cloud,” he said.

Building hybrid cloud applications is another focus of the cloud application model. To that end, Microsoft is working on “Sydney,” a project to connect private data centers with Windows Azure. No further details were provided.

Synergetics is a premium brand in the Indian IT industry, with more than 15 years of experience in people competency development. It delivers this through its training and consulting interventions, focusing primarily on clients’ productivity with regard to the projects and deliverables at hand. Its primary differentiator has been its solution-centric approach and its comprehensive, client-focused service portfolio.

On CIOL News,

L&T Infotech, a global IT services provider, today announced a strategic partnership with SEEBURGER Inc., a provider of global business integration solutions, to increase U.S. implementation resources for the latter’s electronic data interchange (EDI) and business-to-business (B2B) integration software.

Under the partnership, L&T Infotech will provide both sales and deployment services for the SEEBURGER Business Integration Server and associated solutions, according to a press release.

L&T Infotech has nine U.S. offices with dedicated teams in key industry sectors with EDI/B2B needs, including technology, manufacturing, finance, healthcare and energy/petrochemicals. The firm has extensive SAP and Oracle expertise as well as B2B systems integration experience, making it possible to support customers who are deploying the SEEBURGER platform in conjunction with an update, migration or implementation of a new ERP system, according to the release.

“Much of our systems integration business is ERP-focused, and many of our ERP customers need B2B integration as well,” said Sudip Banerjee, CEO, L&T Infotech. “Adding SEEBURGER technology to our portfolio will allow us to serve that need with what we consider a robust, advanced and scalable EDI/B2B platform,” he added.

“Partnering with L&T Infotech expands our services capacity and provides an additional expert implementation resource for our U.S. customers, particularly for crossover deployments involving tandem ERP/EDI upgrades,” said Wesley Thompson, VP of Business Development, SEEBURGER Inc.

SEEBURGER’s EDI/B2B solution suite includes multiple B2B gateways and related products for disparate enterprise needs, including specialized solutions that automate document exchange with non-EDI-enabled trading partners via e-mail, spoke units and partner portals, the release added.


Synergetics has been awarded “Best .NET Training Service Provider” by Microsoft.

“AT&T to introduce new cloud computing service” on Siliconindia News Bureau

Global telecom company AT&T has expanded its portfolio of cloud-based services to include on-demand compute capacity.

The addition of the Synaptic Compute as a Service offering strengthens AT&T’s position against other large cloud services providers such as Amazon Web Services, Microsoft and Google. The telecom company already offers cloud-based storage and hosting services.

“As companies increasingly move to cloud-based environments, AT&T Synaptic Compute as a Service provides a much-needed choice for IT executives who worry about over-building or under-investing in the capacity needed to handle their users’ traffic demands,” said Roman Pacewicz, Vice President of Strategy and Application Services, AT&T.

The service, expected to launch in the fourth quarter of 2009, will feature a Web-based interface, a pay-as-you-go billing structure and multiple storage options for use with the existing Synaptic Storage offering. AT&T said that there will be no up-front fees, long-term obligations or early-termination penalties.

The company partnered with leading virtualization software developer VMware and technology company Sun to develop its newest offering. The product uses VMware’s vSphere hypervisor and vCloud API.

The company will deploy the service in the U.S., but it will be accessible from anywhere over the Internet. It plans to expand the offering globally in the future.


From SD Times: “Telerik looks to boost 3D for Silverlight,” by Jeff Feinman.

.NET component provider Telerik has brought a 3D charting engine for Microsoft Silverlight to the second-quarter release of its RadControls UI components and Web testing suite.

The company said that the charting engine, released today, can turn complex data into interactive, animated graphics.

“Telerik has built a 3D engine to render real 3D objects so they can be manipulated in 3D space, and it will be delivered for both Silverlight 2 and Silverlight 3,” said Todd Anglin, chief evangelist for Telerik.

Microsoft rolled out 3D capabilities in the 3.0 release of Silverlight, which was introduced and released in beta at the MIX 09 conference in March. Anglin claimed that Telerik’s 3D charting engine allows for the creation of richer, more complex graphics than what Silverlight developers might be used to with the Silverlight 3 beta.

Another main feature in the release that Anglin talked about is a free Web tester called the WebAii Testing Framework for RadControls. This lets developers simulate how a user interface works in ASP.NET and Silverlight.

“Today, a developer may open up a Web app, click through, and log in manually,” Anglin said. “Our testing framework allows them to program those steps and automate the process of manually clicking through your application to ensure the UI works the way you expect it to.”
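To illustrate the kind of scripted walkthrough Anglin describes, here is a generic sketch of automating an “open the app, log in, verify the result” check. It uses Selenium WebDriver in Python purely as a stand-in for a UI automation tool; it is not Telerik's WebAii API, and the URL and element IDs are hypothetical.

    # Generic sketch of automating a manual "click through and log in" check.
    # Selenium WebDriver (Python) is used only as a stand-in for a UI testing
    # tool; this is not Telerik's WebAii API. URL and element IDs are made up.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def test_login_flow():
        driver = webdriver.Chrome()
        try:
            driver.get("https://example.com/app/login")  # hypothetical app URL
            driver.find_element(By.ID, "username").send_keys("demo-user")
            driver.find_element(By.ID, "password").send_keys("demo-pass")
            driver.find_element(By.ID, "login-button").click()
            # Check that the UI ended up where a human tester would expect.
            assert "Dashboard" in driver.title
        finally:
            driver.quit()

    if __name__ == "__main__":
        test_login_flow()
        print("login flow behaved as expected")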

The price of Telerik’s full RadControls suite is US$1,299 per developer seat.

This release also brings in Silverlight Scheduler, which offers scheduling features similar to those of Microsoft Outlook, and Visual Style Builder, which is used to design customized skins for RadControls for ASP.NET AJAX.


 

On InfoWorld, Paul Krill Writes,

Windows Azure, Microsoft‘s fledgling cloud computing platform, is piquing the interest of IT specialists who see it as a potential solution for dealing with variable compute loads. But an uptick in deployments for Azure, which becomes a fee-based service early next year, could take a while, with customers still just evaluating the technology.

“We’d be targeting applications that have variable loads” for possible deployment on Azure, said David Collins, a system consultant at the Unum life insurance company. The company might find Azure useful for an enrollment application. “We have huge activity in November and December and then the rest of the year, it’s not so big,” Collins said. Unum, however, is not ready to use Azure, with Collins citing issues such as integrating Azure with IBM DB2 and Teradata systems.

“From a scale-out perspective and for the future, it’s kind of interesting to hear” about Azure, said Michael Tai, director of development at Classified Ventures. But his company is probably not looking to use Azure in the short term, he said.

Meanwhile, an advertising agency that has done ads for Windows 7 already has used Azure. An official of that company also cited benefits in offloading of compute cycles to the cloud. “We’ve used Azure on a couple of projects already and had great success with it,” said Matthew Ray, technical director at Crispin Porter + Bogusky. “I think what helps us is we don’t have all the time and money” to build huge server clusters for projects that get a lot of traffic but only live for a month, Ray said. Using traditional platforms, “you can spend inordinate amounts of money — hundreds of thousands of dollars — to support something like the Super Bowl, something like that, and you’re done in a day, basically,” he said.

Microsoft has improved Azure since the last time Chevron looked at it. “It wasn’t as rich as it looks now,” said Sean Gordon, an architect on the strategy architecture emerging technology team at Chevron. “We’re looking at offloading compute resources, potentially, into the cloud,” he noted.

A Microsoft SharePoint software vendor sees Azure‘s potential for purposes such as extranets. “A lot of applications I can see being extended to the cloud,” said Stephen Cawood, community director at Metalogix. “For big companies, they’re still going to want to have their own datacenters and host things like SharePoint, but I can see them using cloud computing possibly for extranet scenarios where they’re working with partners or even customers.”

David Nahooray, a software developer for the Organization for Economic Cooperation and Development (OECD), an international intergovernmental agency, said any decision to go to the cloud would be made at a higher level. “[Azure] looks interesting, but it’s probably up to my boss to decide if we can go and put stuff outside in the cloud,” Nahooray said. Data such as economic indicators could be deployed on Azure for access by other organizations, he said.

Cloud computing is here. Running applications on machines in an Internet-accessible data center can bring plenty of advantages. Yet wherever they run, applications are built on some kind of platform. For on-premises applications, this platform usually includes an operating system, some way to store data, and perhaps more. Applications running in the cloud need a similar foundation. The goal of Microsoft’s Windows Azure is to provide this. Part of the larger Azure Services Platform, Windows Azure is a platform for running Windows applications and storing data in the cloud.

Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. When deploying a new solution, most of your time and energy is usually spent defining the right infrastructure, hardware and software, to put together to create that solution; cloud computing instead allows people to share resources to solve new problems. Cloud computing users can avoid capital expenditure (CapEx) on hardware, software, and services because they pay a provider only for what they use.
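As a rough, back-of-the-envelope illustration of that pay-for-what-you-use point (all figures below are invented; real cloud pricing varies by provider), compare buying enough servers for a two-month peak against renting capacity by the hour:

    # Hypothetical comparison of buying peak capacity up front (CapEx) versus
    # paying per server-hour. Every number here is invented for illustration.
    PEAK_SERVERS = 40               # servers needed in the two busy months
    BASELINE_SERVERS = 5            # servers needed the other ten months
    SERVER_PURCHASE_COST = 3_000.0  # assumed up-front cost per server
    CLOUD_HOURLY_RATE = 0.25        # assumed cost per server-hour
    HOURS_PER_MONTH = 730

    capex = PEAK_SERVERS * SERVER_PURCHASE_COST  # must buy for the peak

    pay_per_use = CLOUD_HOURLY_RATE * HOURS_PER_MONTH * (
        2 * PEAK_SERVERS + 10 * BASELINE_SERVERS)

    print(f"own peak capacity: ${capex:,.0f} up front")
    print(f"pay as you go:     ${pay_per_use:,.0f} over the year")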


From ITVN Network, Autodesk Announces Support for Windows 7,

Autodesk, a 2D and 3D design, engineering and entertainment software company, has announced support for Windows 7. Autodesk will support customers using nine products, including AutoCAD 2010, AutoCAD LT 2010 and the Autodesk Inventor 2010 family of software, on Windows 7.

“Windows 7 was designed with the customer in mind,” said Mark Relph, senior director for Windows Product Management at Microsoft Corp. “We are pleased to have the support of Autodesk to offer our mutual customers an easy way to do the things they want on a PC.”

For the 2010 product line, Autodesk will support nine products on Windows 7. These products are Autodesk Inventor 2010, Autodesk Inventor LT 2010, AutoCAD 2010, AutoCAD LT 2010, AutoCAD Architecture 2010, AutoCAD Electrical 2010, AutoCAD Mechanical 2010, AutoCAD MEP 2010 and Autodesk Algor Simulation 2010 software. Autodesk will support Windows 7 for most of its other products as updated versions are released.

“Autodesk is committed to delivering software that meets and exceeds our customers’ functionality requirements, while providing them the broadest possible choice of operating systems,” said Chris Bradshaw, Autodesk chief marketing officer. “Autodesk and Microsoft share a goal of making the tasks our customers perform every day easier and faster. We look forward to extending support for Windows 7 across our portfolio so all our customers can benefit from the improvements in this robust new operating system.”


Science Daily, January 19th, 2010

Even though it was buried in a chart, I considered the May 2010 due date for SQL Server 2008 R2 to be “official.” But on January 19, Microsoft made it officially official.

Microsoft is confirming that the latest version of its database will be out “by May” and will be on the May price list. A new posting on the Microsoft Data Platform Insider blog confirmed the May date.

According to that blog, there have been 150,000 downloads by testers of the R2 release.

The November Community Technology Preview (CTP) release was the last test build Microsoft is planning to issue for the product, officials said.

SQL Server 2008 R2, codenamed Kilimanjaro, will come in a number of new flavors, including a Datacenter edition and a Parallel Data Warehouse edition (formerly codenamed “Project Madison”).

The Datacenter edition builds on the SQL Server 2008 R2 Enterprise product but adds application and multi-server management, virtualization, and high-scale complex event processing (via StreamInsight), and it supports more than 8 processors and up to 256 logical processors.

The Parallel Data Warehouse version will be sold preloaded on servers as a data warehouse appliance. Using the DATAllegro technology Microsoft acquired in 2008, it will scale customers’ data warehouses from tens of terabytes up to the one-petabyte-plus range, according to the company.

All versions will be available commercially in May except Parallel Data Warehouse, a spokesperson said. All Microsoft will say about that edition is that it will be out in the first half of 2010 (so I guess that means it could slip to June).


By David Worthington, January 18, 2010

New application life-cycle management (ALM) and testing features found in Visual Studio Team System 2010 are broadening the number of Microsoft partners that are building tools and services for the platform.

For ALM, Microsoft introduced architectural tools for model-driven development and a new approach to how Team Foundation Server (TFS) handles hierarchical work items and work item linking. The company also modified Visual Studio’s licensing scheme to broaden customers’ access to Visual Studio’s ALM tools.

“Work Items can have parents and children, and work item links can have types (e.g., a bug can be linked to a test by a ‘tested by’ link type),” said Terry Clancy, business development manager for Microsoft’s developer tools ecosystem.
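As a rough sketch of what hierarchical work items with typed links mean in practice (this models the idea only, in Python for illustration; it is not the TFS object model or API), consider:

    # Small data-structure sketch of hierarchical work items with typed links,
    # illustrating Clancy's example of a bug linked to a test case by a
    # "Tested By" link. This models the idea only; it is not the TFS API.
    from dataclasses import dataclass, field

    @dataclass
    class WorkItem:
        id: int
        kind: str                     # e.g. "User Story", "Bug", "Test Case"
        title: str
        children: list = field(default_factory=list)
        links: list = field(default_factory=list)

        def add_child(self, child):
            self.children.append(child)

        def link(self, link_type, target):
            self.links.append((link_type, target))

    story = WorkItem(1, "User Story", "Customer can reset password")
    bug = WorkItem(2, "Bug", "Reset e-mail never arrives")
    test = WorkItem(3, "Test Case", "Verify reset e-mail is delivered")

    story.add_child(bug)              # parent/child hierarchy
    bug.link("Tested By", test)       # typed work-item link

    for link_type, target in bug.links:
        print(f"Bug #{bug.id} is '{link_type}' {target.kind} #{target.id}")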

Those changes, combined with increased use of Visual Studio Team System and TFS, have led to an increase in requests from tool ISVs to integrate with Visual Studio, he said.

Third parties are offering a crop of new process templates for Visual Studio Team System 2010, Clancy said. “Process templates are an emerging space and a big ALM thing. The [Visual Studio] ALM ecosystem will bear fruit in 2010.”

Those third parties include EMC Consulting, which is releasing a new edition of Scrum for Team System; Object Consulting, which develops Process Mentor, a suite of modifiable process templates for TFS; and Ivar Jacobson Consulting, which produces a process template for Essential Unified Process.

“VS 2010 is a far more sophisticated product, which enabled us to build a far more sophisticated template,” said EMC advisory practice consultant Simon Bennett.

Microsoft is also seeing a rise in requirements management solutions from companies like eDev, IBM Telelogic, Personify Design and Ravenflow, Clancy said.

Visual Studio’s new testing features include manual testing support, lab environment management with automated test deployment to virtual machines, test case management, and UI test automation.

Odin Technology, an automated software testing company, is a new Microsoft partner that previously only supported IBM Rational and HP Mercury products, Clancy said.

“Axe is in use in a number of high-profile companies across the globe, albeit producing code for tools from other vendors, e.g. HP, IBM, Micro Focus,” said Duncan Brigginshaw, owner and director of Odin.

“We’re aiming to ship with VS 2010 for use with their new testing and coded UI features. We think MS has a different and exciting slant on the [test] market.”

Fortify Software, Micro Focus, Quest Software and most of Microsoft‘s component vendors (with components that work with record and playback) are among other partners producing testing tools for VS 2010, Clancy said.

There are also one or two partners that will be integrating IntelliTrace, a historical debugging feature introduced in VS 2010, into their monitoring and diagnostics products, he added.

The very nature of Microsoft’s enhancements, such as build and release management tool integrations, custom reports, and custom templates, creates a larger “surface of engagement,” said Forrester principal analyst Jeffrey Hammond.

“Not lost on these folks is that .NET developers still seem willing to pay for development tools and services, something that’s not always the case for other application platforms.”


Cloud computing won’t have as much value unless we get the data-integration mechanisms right

In a recent InfoWorld article by Paul Krill, Vint Cerf, who is a co-designer of the Internet’s TCP/IP standards and widely considered a father of the Internet, spoke about the need for data portability standards for cloud computing. “There are different clouds from companies such as Microsoft, Amazon, IBM, and Google, but a lack of interoperability between them,” Cerf explained at a session of the Churchill Club business and technology organization in Menlo Park, Calif.

Interoperability has not been a huge focus around the quickly emerging cloud computing space. Other than “we support interoperability” statements from the larger cloud computing providers, there is not a detailed plan to be seen. I’ve brought it up several times at cloud user group meetings, with clients, and at vendor briefings, and I often feel like I’m the kid in class who reminds the teacher to assign homework.


Data interoperability is not that hard. You’re dealing with a few key concepts, such as semantic interoperability, or the way that data is defined and stored on one cloud versus another. Also, you need to consider the notions of transformation and translation, so the data appears native when it arrives at the target cloud, or clouds, from the source cloud (or clouds). Don’t forget to add data governance and data security to the mix; you’ll need those as well.
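A toy example of that transformation-and-translation step, with invented record layouts standing in for two different clouds' schemas (no real provider's data model is implied), written in Python for illustration:

    # Toy illustration of "transformation and translation": the same customer
    # record expressed in two hypothetical cloud schemas. Field names and
    # layouts are invented; no real provider's data model is implied.
    from datetime import datetime, timezone

    def translate_a_to_b(record_a):
        """Map a record from hypothetical cloud A's schema to cloud B's."""
        return {
            "customerId": record_a["cust_id"],                      # rename
            "fullName": f"{record_a['first']} {record_a['last']}",  # merge fields
            "createdAt": datetime.fromtimestamp(                    # epoch -> ISO 8601
                record_a["created_epoch"], tz=timezone.utc
            ).isoformat(),
        }

    source = {"cust_id": 42, "first": "Ada", "last": "Lovelace",
              "created_epoch": 1_263_859_200}
    print(translate_a_to_b(source))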

There has been some talk of concepts such as the Intercloud, or a data exchange system running between major cloud computing providers. Also, a few cloud standards organizations, such as the Open Cloud Consortium, are looking to drive some interoperability standards, including a group working on standards and interoperability for “large data clouds.”

So how do we get down the path to data interoperability for the clouds? Don’t create yet another standards organization to look at this by committee. They take too long, and this is something that’s needed in 2010 to drive cloud computing adoption. Instead, the larger cloud computing providers should focus on this behind the scenes and create a working standard enabling technology to solve the data interoperability problem. If the larger providers are all on the same page, believe me, the smaller providers will quickly follow.

This article, “The data interoperability challenge for cloud computing,” was originally published at InfoWorld.com. Follow the latest developments on cloud computing at InfoWorld.com.


Actionable point for making yourself a better professional and more marketable for the future: become a versatilist.


On ScottGu’s Blog, 24th January 2010

Technical debates are discussed endlessly within the blog-o-sphere/twitter-verse, and they range across every developer community. Each language, framework, tool, and platform inevitably has at least a few going on at any particular point in time.

Below are a few observations I’ve made over the years about technical debates in general, as well as some comments about the discussions I’ve seen recently on the topic of ASP.NET Web Forms and ASP.NET MVC in particular.

General Observations About Technical Debates

Below are a few general observations independent of any specific technical debate:

a) Developers love to passionately debate and compare languages, frameworks, APIs, and tools. This is true in every programming community (ASP.NET, Java, PHP, C++, Ruby, Python, etc.). I think you can view these types of religious technical debates in two ways:

  1. They are sometimes annoying and often a waste of time.
  2. They are often a sign of a healthy and active community (since passion means people care deeply on both sides of a debate, which is far better than apathy).

Personally I think both points are true.

b) There is never only “one right way” to develop something. As an opening interview question I sometimes ask people to sort an array of numbers in the most efficient way they can. Most people don’t do well with it. This is usually not because they don’t know sort algorithms, but rather because they never think to ask about the scenarios and requirements behind it – which is critical to understanding the most efficient way to do it. How big is the sequence of numbers? How random is the typical number sequence (is it sometimes already mostly sorted, how big is the spread of numbers, are the numbers all unique, do duplicates cluster together)? How parallel is the computer architecture? Can you allocate memory as part of the sort, or must it be constant? Etc. These are important questions to ask because the most efficient and optimal way to sort an array of numbers depends on understanding the answers.

Whenever people assert that there is only “one right way” to solve a programming problem, they are almost always assuming a fixed set of requirements/scenarios/inputs – which is rarely optimal for every scenario or every developer. And to state the obvious – most problems in programming are far more complex than sorting an array of numbers.
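As a minimal sketch of this point (in Python, just for illustration): the same list of numbers has at least two reasonable answers depending on the requirements, counting sort when the value range is known to be small, the language's built-in comparison sort otherwise.

    # Minimal illustration of point (b): "the most efficient way" to sort
    # depends on the answers to the scenario questions. If the values are
    # known to fall in a small integer range, counting sort is O(n + k);
    # otherwise the built-in comparison sort (O(n log n)) is the safer answer.
    def counting_sort(values, max_value):
        counts = [0] * (max_value + 1)
        for v in values:
            counts[v] += 1
        out = []
        for value, count in enumerate(counts):
            out.extend([value] * count)
        return out

    def sort_numbers(values, known_small_range=None):
        if known_small_range is not None:
            return counting_sort(values, known_small_range)  # requirements allow O(n + k)
        return sorted(values)                                # general case: built-in Timsort

    print(sort_numbers([3, 1, 3, 0, 2], known_small_range=3))
    print(sort_numbers([10**9, -5, 42]))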

c) Great developers using bad tools/frameworks can make great apps. Bad developers using great tools/frameworks can make bad apps. Be very careful about making broad assumptions (good or bad) about the quality of the app you are building based on the tools/frameworks used.

d) Developers (good and bad) can grow stronger by stretching themselves and learning new ideas and approaches.  Even if they ultimately don’t use something new directly, the act of learning it can sharpen them in positive ways.

e) Change is constant in the technology industry.  Change can be scary.  Whether you get overwhelmed by change, though, ultimately comes down to whether you let yourself be overwhelmed.  Don’t stress about having to stop and suddenly learn a bunch of new things – rarely do you have to. The best approach to avoid being overwhelmed is to be pragmatic, stay reasonably informed about a broad set of things at a high-level (not just technologies and tools but also methodologies), and have the confidence to know that if it is important to learn a new technology, then your existing development skills will mostly transition and help.  Syntax and APIs are rarely the most important thing anyway when it comes to development – problem solving, customer empathy/engagement, and the ability to stay focused and disciplined on a project are much more valuable.

f) Some guidance I occasionally give people on my team when working and communicating with others:

  1. You will rarely win a debate with someone by telling them that they are stupid – no matter how well intentioned or eloquent your explanation of their IQ problems might be.
  2. There will always be someone somewhere in the world who is smarter than you – don’t always assume that they aren’t in the room with you.
  3. People you interact with too often forget the praise you give them, and too often remember a past insult – so be judicious about handing out insults, as they come back to haunt you later.
  4. People can and do change their minds – be open to being persuaded in a debate, and neither gloat nor hold it against someone else if they also change their minds.

g) I always find it somewhat ironic when I hear people complain about programming abstractions not being good. Especially when these complaints are published via blogs – whose content is displayed using HTML, is styled with CSS, made interactive with JavaScript, transported over the wire using HTTP, and implemented on the server with apps written in higher-level languages, using object-oriented garbage-collected frameworks, running on top of either interpreted or JIT-compiled byte code runtimes, and which ultimately store the blog content and comments in relational databases ultimately accessed via SQL query strings. All of this running within a VM on a hosted server – with the OS within the VM partitioning memory across kernel and user mode process boundaries, scheduling work using threads, raising device events using signals, and using an abstract storage API for disk persistence. It is worth keeping all of that in mind the next time you are reading an “ORM vs. Stored Procedures” or “server controls – good/bad?” post. The more interesting debates are about what the best abstractions are for a particular problem.

h) The history of programming debates is one long infinite loop – with most programming ideas having been solved multiple times before. And for what it’s worth – many of the problems we debate today were long ago solved with LISP and Smalltalk. Ironically, despite pioneering a number of things quite elegantly, these two languages tend not to be used much anymore. Go figure.

Observations About the ASP.NET Web Forms and ASP.NET MVC Debate

a) Web Forms and MVC are two approaches for building ASP.NET apps. They are both good choices. Each can be the “best choice” for a particular solution depending on the requirements of the application and the background of the team members involved. You can build great apps with either. You can build bad apps with either. You are not a good or bad developer depending on what you choose. You can be absolutely great or worthless with either.

b) The ASP.NET and Visual Studio teams are investing heavily in both Web Forms and MVC.  Neither is going away.  Both have major releases coming in the months ahead.  ASP.NET 4 includes major updates to Web Forms (clean ClientIDs and CSS based markup output, smaller ViewState, URL Routing, new data and charting controls, new dynamic data features, new SEO APIs, new VS designer and project improvements, etc, etc).  ASP.NET 4 will also ship with ASP.NET MVC 2 which also includes major updates (strongly typed helpers, model validation, areas, better scaffolding, Async support, more helper APIs, etc, etc).  Don’t angst about either being a dead-end or something you have to change to.  I suspect that long after we are all dead and gone there will be servers somewhere on the Internet still running both ASP.NET Web Forms and ASP.NET MVC based apps.

c) Web Forms and MVC share far more code/infrastructure/APIs than anyone on either side of any debate about them ever mentions – Authentication, Authorization, Membership, Roles, URL Routing, Caching, Session State, Profiles, Configuration, Compilation, .aspx pages, .master files, .ascx files, Global.asax, Request/Response/Cookie APIs, Health Monitoring, Process Model, Tracing, Deployment, AJAX, etc, etc, etc.  All of that common stuff you learn is equally valid regardless of how you construct your UI.  Going forward we’ll continue to invest heavily in building core ASP.NET features that work for both Web Forms and MVC (like the URL Routing, Deployment, Output Caching, and DataAnnotations for Validation features we are adding with ASP.NET 4).

d) I often find debates around programming model appropriateness and abstractions a little silly. Both Web Forms and MVC are web framework programming abstractions, built on top of a broader framework abstraction, programmed with higher-level programming languages, running on top of an execution engine abstraction that itself is running on top of a giant abstraction called an OS. What you are creating with each is HTML/CSS/JavaScript (all abstractions persisted as text, transmitted over HTTP – another higher-level protocol abstraction).

The interesting question to debate is not whether abstractions are good or not – but rather which abstractions feel most natural to you, and which map best to the requirements/scenarios/developers of your project.

e) As part of that we will be posting more end to end tutorials/content (for both Web Forms and MVC).  We will also be providing tutorials and guidance that will help developers quickly evaluate both the Web Forms and MVC approach, easily learn the basics about how both work, and quickly determine which one feels best for them to use. This will make it easy for developers new to ASP.NET, as well as developers who already know either Web Forms or MVC, to understand and evaluate the two approaches and decide which they want to use.

f) Decide on a per-project basis whether you want to use Web Forms or MVC, and feel good about your choice. Both can be good choices. Respect the choices other people make – the choice they have made is also hopefully a good one that works well for them. Keep in mind that in all likelihood they know a lot more about their own business/skills than you do. Likewise, you hopefully know a lot more about your own business/skills than they do.

g) Share ideas and best practices with others.  That is a big part of what blogs, forums, listservs and community is all about.  What makes them work great is when people know that their ideas aren’t going to be ripped to shreds, and that they will be treated with respect.  Be constructive, not snarky. Teach, don’t lecture. Remember there is always someone else out there who you can also learn from.
