Posts Tagged ‘oracle’

Science Daily, January 19th, 2010

Even though it was buried in a chart, I considered the May 2010 due date for SQL Server 2008 R2 to be “official.” But on January 19, Microsoft made it officially official.

Microsoft is confirming that the latest version of its database will be out “by May” and will be on the May price list. A new posting on the Microsoft Data Platform Insider blog confirmed the May date.

According to that blog, there have been 150,000 downloads by testers of the R2 release.

The November Community Technology Preview (CTP) release was the last test build Microsoft planned to issue for the product, officials said.

SQL Server 2008 R2, codenamed Kilimanjaro, will come in a number of new flavors, including a Datacenter edition and a Parallel Data Warehouse edition (formerly codenamed “Project Madison”).

The Datacenter edition builds on the SQL Server 2008 R2 Enterprise product, but adds application and multi-server management, virtualization, and high-scale complex event processing (via StreamInsight), and supports more than 8 processors and up to 256 logical processors.

The Parallel Data Warehouse version will be sold preloaded on servers as a data warehouse appliance. Using the DATAllegro technology Microsoft acquired in 2008, it will scale customers’ data warehouses from the tens of terabytes up to the one-petabyte-plus range, according to the company.

All versions except Parallel Data Warehouse will be available commercially in May, a spokesperson said. All Microsoft will say about that edition is that it will be out in the first half of 2010 (so I guess that means it could slip to June).

Synergetics was awarded “Best .NET Training Service Provider” by Microsoft.

On ScottGu’s Blog, 24th January 2010

Technical debates are discussed endlessly within the blog-o-sphere/twitter-verse, and they range across every developer community. Each language, framework, tool, and platform inevitably has at least a few going on at any particular point in time.

Below are a few observations I’ve made over the years about technical debates in general, as well as some comments about the recent discussions I’ve seen on the topic of ASP.NET Web Forms and ASP.NET MVC in particular.

General Observations About Technical Debates

Below are a few general observations independent of any specific technical debate:

a) Developers love to passionately debate and compare languages, frameworks, APIs, and tools.  This is true in every programming community (ASP.NET, Java, PHP, C++, Ruby, Python, etc).  I think you can view these types of religious technical debates in two ways:

  1. They are sometimes annoying and often a waste of time.
  2. They are often a sign of a healthy and active community (since passion means people on both sides of a debate care deeply, and passion is far better than apathy).

Personally I think both points are true.

b) There is never only “one right way” to develop something. As an opening interview question I sometimes ask people to sort an array of numbers in the most efficient way they can.  Most people don’t do well with it.  This is usually not because they don’t know sort algorithms, but rather because they never think to ask about the scenarios and requirements behind it – which is critical to understanding the most efficient way to do it.  How big is the sequence of numbers? How random is the typical number sequence (is it sometimes already mostly sorted, how big is the spread of numbers, are the numbers all unique, do duplicates cluster together)? How parallel is the computer architecture?  Can you allocate memory as part of the sort, or must memory use be constant?  Etc. These are important questions to ask because the most efficient and optimal way to sort an array of numbers depends on understanding the answers.

Whenever people assert that there is only “one right way” to solve a programming problem they are almost always assuming a fixed set of requirements/scenarios/inputs – which is rarely optimal for every scenario or every developer.  And to state the obvious – most problems in programming are far more complex than sorting an array of numbers.
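To make that concrete, here is a small illustrative sketch in C# (the helper name and the spread-based threshold are illustrative assumptions, not part of the interview question): when the values happen to fall into a small, dense range, a counting sort with no comparisons at all can beat a general-purpose sort, while for an arbitrary spread the framework’s O(n log n) comparison sort is the safer default.

    using System;

    static class SortSketch
    {
        // Hypothetical helper: pick a sorting strategy based on what we
        // actually know about the data.
        public static void SortInPlace(int[] values)
        {
            if (values.Length < 2) return;

            int min = values[0], max = values[0];
            foreach (int v in values)
            {
                if (v < min) min = v;
                if (v > max) max = v;
            }

            long range = (long)max - min + 1;

            // Densely packed values: counting sort is O(n + range) and does
            // no comparisons, at the cost of O(range) extra memory.
            if (range <= 2L * values.Length)
            {
                var counts = new int[range];
                foreach (int v in values) counts[v - min]++;

                int i = 0;
                for (long b = 0; b < range; b++)
                    while (counts[b]-- > 0)
                        values[i++] = (int)(b + min);
            }
            else
            {
                // Large or unknown spread: fall back to the framework's
                // general-purpose comparison sort.
                Array.Sort(values);
            }
        }
    }

If the input were instead huge, already nearly sorted, full of clustered duplicates, or needed to be sorted in parallel across many cores, the trade-off would shift yet again, which is exactly the point of the question.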

c) Great developers using bad tools/frameworks can make great apps. Bad developers using great tools/frameworks can make bad apps. Be very careful about making broad assumptions (good or bad) about the quality of the app you are building based on the tools/frameworks used.

d) Developers (good and bad) can grow stronger by stretching themselves and learning new ideas and approaches.  Even if they ultimately don’t use something new directly, the act of learning it can sharpen them in positive ways.

e) Change is constant in the technology industry.  Change can be scary.  Whether you get overwhelmed by change, though, ultimately comes down to whether you let yourself be overwhelmed.  Don’t stress about having to stop and suddenly learn a bunch of new things – rarely do you have to. The best approach to avoid being overwhelmed is to be pragmatic, stay reasonably informed about a broad set of things at a high-level (not just technologies and tools but also methodologies), and have the confidence to know that if it is important to learn a new technology, then your existing development skills will mostly transition and help.  Syntax and APIs are rarely the most important thing anyway when it comes to development – problem solving, customer empathy/engagement, and the ability to stay focused and disciplined on a project are much more valuable.

f) Some guidance I occasionally give people on my team when working and communicating with others:

  1. You will rarely win a debate with someone by telling them that they are stupid – no matter how well intentioned or eloquent your explanation of their IQ problems might be.
  2. There will always be someone somewhere in the world who is smarter than you – don’t always assume that they aren’t in the room with you.
  3. People you interact with too often forget the praise you give them, and too often remember a past insult – so be judicious in handing out insults, as they come back to haunt you later.
  4. People can and do change their minds – be open to being persuaded in a debate, and neither gloat nor hold it against someone else if they also change their minds.

g) I always find it somewhat ironic when I hear people complain about programming abstractions not being good.  Especially when these complaints are published via blogs – whose content is displayed using HTML, is styled with CSS, made interactive with JavaScript, transported over the wire using HTTP, and implemented on the server with apps written in higher-level languages, using object oriented garbage collected frameworks, running on top of either interpreted or JIT-compiled byte code runtimes, and which ultimately store the blog content and comments in relational databases ultimately accessed via SQL query strings.  All of this running within a VM on a hosted server – with the OS within the VM partitioning memory across kernel and user mode process boundaries, scheduling work using threads, raising device events using signals, and using an abstract storage API for disk persistence.  It is worth keeping all of that in mind the next time you are reading an “ORM vs Stored Procedures” or “server controls – good/bad?” post.  The more interesting debates are about what the best abstractions are for a particular problem.

h) The history of programming debates is one long infinite loop – with most programming ideas having been solved multiple times before.  And for what it’s worth – many of the problems we debate today were long ago solved with LISP and Smalltalk.  Ironically, despite pioneering a number of things quite elegantly, these two languages tend not to be used much anymore. Go figure.

Observations About the ASP.NET Web Forms and ASP.NET MVC Debate

a) Web Forms and MVC are two approaches for building ASP.NET apps. They are both good choices. Each can be the “best choice” for a particular solution depending on the requirements of the application and the background of the team members involved. You can build great apps with either.  You can build bad apps with either. You are not a good or bad developer depending on what you choose. You can be absolutely great or worthless using either.

b) The ASP.NET and Visual Studio teams are investing heavily in both Web Forms and MVC.  Neither is going away.  Both have major releases coming in the months ahead.  ASP.NET 4 includes major updates to Web Forms (clean ClientIDs and CSS based markup output, smaller ViewState, URL Routing, new data and charting controls, new dynamic data features, new SEO APIs, new VS designer and project improvements, etc, etc).  ASP.NET 4 will also ship with ASP.NET MVC 2 which also includes major updates (strongly typed helpers, model validation, areas, better scaffolding, Async support, more helper APIs, etc, etc).  Don’t angst about either being a dead-end or something you have to change to.  I suspect that long after we are all dead and gone there will be servers somewhere on the Internet still running both ASP.NET Web Forms and ASP.NET MVC based apps.

c) Web Forms and MVC share far more code/infrastructure/APIs than anyone on either side of any debate about them ever mentions – Authentication, Authorization, Membership, Roles, URL Routing, Caching, Session State, Profiles, Configuration, Compilation, .aspx pages, .master files, .ascx files, Global.asax, Request/Response/Cookie APIs, Health Monitoring, Process Model, Tracing, Deployment, AJAX, etc, etc, etc.  All of that common stuff you learn is equally valid regardless of how you construct your UI.  Going forward we’ll continue to invest heavily in building core ASP.NET features that work for both Web Forms and MVC (like the URL Routing, Deployment, Output Caching, and DataAnnotations for Validation features we are adding with ASP.NET 4).
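As a small, hedged illustration of that shared plumbing (the route names, URL patterns, page path and controller name below are hypothetical), the same URL routing table can send one URL to a Web Forms page and another to an MVC controller from a single Global.asax:

    using System.Web;
    using System.Web.Mvc;
    using System.Web.Routing;

    public class Global : HttpApplication
    {
        protected void Application_Start()
        {
            // Web Forms: map a friendly URL onto a physical .aspx page.
            RouteTable.Routes.MapPageRoute(
                "ProductsWebForms",
                "products/{category}",
                "~/Products.aspx");

            // MVC: map a URL pattern onto a controller/action pair.
            RouteTable.Routes.MapRoute(
                "ProductsMvc",
                "catalog/{controller}/{action}/{id}",
                new { controller = "Catalog", action = "Index", id = UrlParameter.Optional });
        }
    }

Authentication, caching, session state and the rest of the pipeline listed above sit underneath both routes in exactly the same way.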

d) I often find debates around programming model appropriateness and abstractions a little silly. Both Web Forms and MVC are web framework abstractions, built on top of a broader framework abstraction, programmed with higher-level programming languages, running on top of an execution engine abstraction that itself is running on top of a giant abstraction called an OS.  What you are creating with each is HTML/CSS/JavaScript (all abstractions persisted as text, transmitted over HTTP – another higher-level protocol abstraction).

The interesting question to debate is not whether abstractions are good or not – but rather which abstractions feel most natural to you, and which map best to the requirements/scenarios/developers of your project.

e) As part of that we will be posting more end-to-end tutorials and content (for both Web Forms and MVC).  We will also be providing tutorials and guidance that help developers quickly evaluate both the Web Forms and MVC approaches, easily learn the basics of how each works, and quickly determine which one feels best for them to use.  This will make it easy for developers new to ASP.NET, as well as those who already know either Web Forms or MVC, to understand the two approaches and decide which they want to use.

f) Decide on a per-project basis whether you want to use Web Forms or MVC, and feel good about it.  Both can be good choices.  Respect the choices other people make – the choice they have made is hopefully also a good one that works well for them.  Keep in mind that in all likelihood they know a lot more about their own business/skills than you do.  Likewise, you hopefully know a lot more about your own business/skills than they do.

g) Share ideas and best practices with others.  That is a big part of what blogs, forums, listservs and community are all about.  What makes them work great is when people know that their ideas aren’t going to be ripped to shreds, and that they will be treated with respect.  Be constructive, not snarky. Teach, don’t lecture. Remember there is always someone else out there who you can also learn from.

Synergetics was awarded “Best .NET Training Service Provider” by Microsoft.

By Paul Krill | InfoWorld

Project Kenai will still be used internally, but external users must move elsewhere as Oracle consolidates hosting sites

In the wake of its merger with Sun Microsystems, Oracle is discontinuing access to Project Kenai, which was developed by Sun as an open source project-hosting site.

Kenai, Oracle said in an updated FAQ statement for developers on Tuesday, will be discontinued for public use. “Oracle will continue to use it internally and look for ways that our customers can take advantage of it,” the Oracle FAQ said.

[InfoWorld’s Paul Krill reported last week that Oracle canceled plans for the Sun Cloud public cloud service announced by Sun last year. ]

The phasing-out is being done to consolidate project-hosting, according to the Project Kenai Team in a Web posting about the future of Kenai. “Minimizing the number of current project-hosting sites is a start in this direction,” the team said.

At the Kenai beta site, users were advised to begin migrating repositories and content to other locations.

“The complete shutdown of the site and the removal of the domain will be completed in the next 60 days (April 2nd 2010). This should provide ample time for all projects to be moved to a new home of the project owner’s choice,” the Kenai team said.

“Any projects that remain after the 60-day limit (April 2nd 2010) will be removed when the site is turned off,” the team said.

“While it has come time to close the domain of Kenai.com, the infrastructure, which is already used under Netbeans.org, will live on to support other domains in the future,” the team said.

Oracle also lauded in the FAQ the combination of the OTN (Oracle Technology Network), the Sun BigAdmin system administration portal, and the Sun Developer Network, which includes the java.sun.com Web site.

This combination will “result in the largest and most diverse community of developers, database administrators, sysadmins, and architects,” Oracle said.

For the near future, the Sun Developer Network and BigAdmin will remain in their current forms, Oracle said. The company foresees an integration of these sites into a redesigned and re-architected OTN.

Also, Oracle plans to continue to offer certifications for Sun technologies including Java, SPARC, Solaris, and MySQL as part of the Oracle University program.

Oracle one week ago today detailed ambitious plans for its newly acquired Sun technologies.

This story, “Oracle shutting off Sun project-hosting site,” was originally published at InfoWorld.com. Follow the latest news in software development at InfoWorld.com.

Synergetics is a premium brand in the Indian IT industry in the area of people competency development, engaged in delivering it through its training and consulting interventions, primarily focusing on productivity with regard to the projects and deliverables on hand. Its primary differentiator has been its solution-centric approach and its comprehensive, client-focused service portfolio.

Oracle on Monday fattened up its already burgeoning middleware stack, announcing that it has purchased SOA (service-oriented architecture) management vendor AmberPoint. Terms were not disclosed.

SOA refers to a systems design approach that eschews monolithic applications and instead designates various processes, such as running a credit check on a customer, as interoperable “services” that allow code to be flexibly reused.
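As a purely illustrative sketch of that “credit check as a reusable service” idea (the contract and member names below are hypothetical and not tied to any AmberPoint or Oracle product), a WCF-style service contract in C# lets order entry, loan origination or CRM applications all call the same operation instead of re-implementing the logic:

    using System.Runtime.Serialization;
    using System.ServiceModel;

    [ServiceContract]
    public interface ICreditCheckService
    {
        // One published operation, reusable from any application that
        // needs a credit decision.
        [OperationContract]
        CreditDecision CheckCredit(string customerId, decimal requestedAmount);
    }

    [DataContract]
    public class CreditDecision
    {
        [DataMember] public bool Approved { get; set; }

        [DataMember] public decimal ApprovedLimit { get; set; }
    }

Runtime governance tools such as AmberPoint’s then watch the message traffic flowing through contracts like this one to report on performance and failures.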

AmberPoint’s software is used to monitor the performance of SOA-driven applications and help users solve problems. It is “highly complementary” to Oracle’s own SOA software and will “enable increased control and performance of critical applications across the enterprise,” according to an FAQ document Oracle released Monday (PDF).

It is not clear how the deal will affect road maps for AmberPoint’s products. A review is under way and more details will be forthcoming, Oracle said. Investment in the products is expected to increase, according to the FAQ.

“AmberPoint was one of a dwindling group of still-standing independents delivering runtime governance for SOA environments,” analyst Tony Baer said on the OnStrategies Perspective blog.

The move “patches some gaps in its Enterprise Manager offering, not only in SOA runtime governance, but also with business transaction management — and potentially — better visibility to non-Oracle systems,” he added.

The deal is expected to close in the first half of this year.

In 1995, Synergetics envisioned OWS as an engagement format to extend its “Cutting Edge Technology” competency-building training programs to individual software professionals, enhancing their knowledge and skills in line with their career aspirations and project delivery needs.

Synergetics is a premium brand in the Indian IT industry with an experience base of over 15 years in the area of people competency development, engaged in delivering it through its training and consulting interventions, primarily focusing on productivity with regard to the projects and deliverables on hand. Its primary differentiator has been its solution-centric approach and its comprehensive, client-focused service portfolio.

By Kevin Fogarty, 26th January 2010

CIO – IT people with skills and experience in server virtualization, cloud computing or both have a far greater chance of getting and keeping jobs than most other IT people now, according to recruiters and analysts. But what do you call these gurus? There’s no accepted standard for what to call either virtualization or cloud-computing specialists, so jobseekers will have to look for a range of keywords, and include those in their resumes, to find a match with particular employers, says Dice.com spokesperson Jennifer Bewley.

If you are searching for a virtualization or cloud role, watch your search terms, she says.

Just using “virtualization” as a keyword pulled up 880 jobs on Dice.com on one day last week, according to Bewley.

“However, there are another 900 jobs that include ‘VMware’ as a keyword with no mention of virtualization,” Bewley found. “That leads us to conclude that searching based on vendor is particularly important in virtualization jobs.”

Common terms for virtualization specialists include: Architect SAN/Virtualization; Citrix / VMware specialist or administrator; Data Center Virtualization Systems Analyst; and Product Manager for Large Scale Virtualization.

Cloud Job Searching Tricky

People looking for new jobs described using the term “cloud computing” may not be completely out of luck, but they’re not far off, according to M. Victor Janulaitis, CEO of IT job-market researcher Janco Associates, which this month published a survey of CIO hiring plans for 2010.

“I have seen some people just use ‘cloud specialist’ to describe themselves,” Janulaitis says. “There’s not really a set of terms yet that are common to refer to cloud computing skills – they just refer to them as architecture or infrastructure skill sets.”

[For timely cloud computing news and expert analysis, see CIO.com’s cloud computing Drilldown section. ]

Other people just add “cloud” or “virtualization” onto more common titles such as system administrator, systems engineer, architect, or network engineer, Bewley says.

Remember, it’s not unusual for employers or prospective employers to use the name of a particular vendor or new technology as a primary identifier in a job ad, especially if they’re looking for someone certified in that vendor’s technology, according to Tom Silver, senior vice president of tech-job ad site Dice North America.

Normally a company would look for technical skills in a particular technology, not just one vendor’s products, Janulaitis says.

“With the economy the way it is, and now people are talking about the possibility of a double-dip recession, a lot of companies are just looking for basic skills and experience,” Janulaitis says.

That means plugging holes where they have to – by hiring one person with experience managing VMware servers, for example – or hiring a lot of junior-level generalists who can be trained in the skills that the company needs, Janulaitis says.

Virtualization Salaries Flattening?

Salaries for virtualization specialists have also hit a plateau, though ads for them increased 30 percent in 2009 compared to 2008, Silver says.

After surging 10 percent in 2008 and into 2009, salaries for virtualization experts were flat this year at an average of $84,777, Dice.com data shows.

That’s still a premium compared to the national average of $78,845 per year for other tech workers, however, Bewley says.

From itvoir News: Oracle Demonstrates Leadership in Enterprise Product Lifecycle Management

Oracle announced that since acquiring Agile Software Corporation in July 2007, the company has been positioned as a PLM leader by industry research firm Gartner, as well as ahead of its competitors by AMR Research, solidifying its leadership position among Enterprise Product Lifecycle Management vendors. With Oracle’s Agile Product Lifecycle Management, Oracle is capitalizing on opportunities to provide Enterprise PLM solutions to new organizations around the world, and has experienced significant customer momentum across multiple industries including consumer goods, life sciences, high-tech and industrial manufacturing.

PLM continues to evolve from an engineering-centric tool used to manage complex product design information to an enterprise-wide business application for managing product information, streamlining business processes and enabling better decisions across the complete product lifecycle. Research published by AMR Research in December 2007 entitled “PLM Market Landscape: Evolving To Enable Value Chain Excellence” found that end users are “increasingly looking at PLM as a standard enterprise platform” and asking how PLM systems “can be used to manage their product portfolios, capture customer needs and integrate non-engineering staff into the product design process, a domain historically dominated by engineers.”

Independent Research Firms Validate Strengths of Oracle’s Agile PLM

Leading research firms have recognized Oracle’s Agile PLM for its market positioning, functionality and customer success. The recently published PLM vendor analysis by AMR Research examined market trends and offerings from various PLM vendors. This report included a PLM vendor landscape comparison that rated 11 PLM vendors’ focus across six key areas of functionality. Compared to the other vendors, Oracle’s Agile PLM was found to offer a high level of focus in the most categories, and was the only vendor to rank medium-high to high across all areas. Additionally, AMR Research cited strengths of Oracle’s Agile PLM to include “industry-specific workflows and templates for FDA-regulated industries, like food and beverage and pharmaceuticals, and strong sourcing capability, which has gained traction particularly in high-tech electronics.”

From CIOL News

Wipro Technologies, the global IT services business of Wipro Limited, today announced that it has been cited by Forrester Research, Inc., an independent research firm, as a leader among Oracle services providers in an October 2009 report titled ‘The Forrester Wave: Oracle Implementation Providers, Q4 2009.’

Forrester evaluated fourteen global vendors providing Oracle services across 60 criteria, the company said in a press release.

“Wipro is the strongest at the technical elements of Oracle implementations, but through Wipro Consulting, it also has a growing focus on process and transformational consulting,” the release added citing the report.

According to the report, “Wipro provides Oracle support and implementation/project services across most major areas of the Oracle family.” The report further adds that “Wipro has a heavy focus on high-tech, CPG/retail, and banking and financial services sectors.”

Elaborating on the strategy of Wipro’s Oracle practice, Langbert Walker, Global Head of the Oracle Practice at Wipro Technologies, said, “This recognition by Forrester reflects our position as a leader in the Oracle services space. Our strategic investments in building deep industry expertise and process capabilities have been instrumental in our success.”

On CIOL

Ovum: The next three years will see cloud computing mature rapidly as vendors and enterprises come to grips with the opportunities and challenges that it represents.

MELBOURNE, AUSTRALIA: Cloud computing, the most important trend for 2010, has barely even started, says Ovum, an analyst and consulting company. The next three years will see cloud computing mature rapidly as vendors and enterprises come to grips with the opportunities and challenges that it represents.

Cloud computing – Been There Done That

Some prefer to limit cloud computing to infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS), whilst others also consider software-as-a-service (SaaS) and private clouds part of the phenomenon.

A wider perspective helps in understanding one of the key trends in cloud computing – cloud computing will be hybrid. “Enterprises will mix and match public and private cloud elements with traditional hosting and outsourcing services to create solutions that fit short- and long-term requirements”, comments Laurent Lachal, Senior Analyst.

“The past 18 months have seen a significant shift in focus away from public clouds towards private ones, owing to a powerful mix of vendor push and user pull”, said Lachal, who is based in London. The private cloud is, to a large extent, a re-badging of what data centre-focused hardware, software and service vendors have been doing under different names (such as utility computing, autonomic IT, on-demand data centre, etc.) for the past 10 years.

Many users are wary of public clouds’ quality of service in areas such as reliability, availability, scalability and security but curious about the possibility of adopting some of their characteristics (e.g. on demand instant provisioning of IT assets).

Private clouds are defined either as the aim of the data centre evolution journey (a long, patient maturation process) or as shortcuts along the way that push parts of the data centre ahead to deliver focused return on investment (the private cloud being the part or parts of the data centre ahead of the rest).

What is needed is a way to reconcile the two approaches (private-cloud-as-a-journey and private-cloud-as-a-shortcut) to understand when, on the road towards a next-generation data centre, users should take shortcuts. Unfortunately, most vendors currently emphasise the second approach rather than trying to reconcile the two.

“Cloud computing promises to tackle two irreconcilable (so far) IT challenges – the need to lower costs and boost innovation – but it will take a lot of effort from enterprises to actually make it work. Instead of a nimbler IT with their IT mess for less somewhere else, the ill-prepared will end up with their IT mess spread across a wider area”, said Lachal.

Lachal believes that adoption is a two-way street. “It is not just about whether cloud computing is ready for enterprises; it is, more importantly, about whether or not enterprises are ready for it”, said Lachal, author of the report. The fact is that many enterprises are currently not particularly ready for either private or public clouds, or for any type of hybrid in between.

Besides the current confusion as to what exactly cloud computing is, many enterprises lack the knowledge, skills and metrics to figure out what is best for them. They need to be able to figure out how to mix and match:

· Totally private and shared private clouds (to collaborate with partners on common goals).

· Public and private clouds, with public clouds used, for example, for workloads that have unpredictable spikes in usage, for applications that are only used occasionally, or to turn pre-production infrastructure (used for tests, migrations, etc.) into production infrastructure and run pre-production work on public clouds instead (since pre-production tasks have much lower quality-of-service requirements than production ones).

· Public clouds and traditional hosting/outsourcing service offerings: hosted offerings, for example, are usually cheaper for static web sites than the Amazon IaaS service, whereas for a use such as application testing, where a handful of servers is needed for only a few weeks and a few hours per day, Amazon IaaS is the answer.

· Public cloud offerings (IaaS, PaaS and SaaS), chosen on the basis of their respective cost-effectiveness.

To do so, they need to improve their knowledge of which assets cost what in public and private clouds as well as in traditional hosting/outsourcing service offerings, along with their ability to monitor, meter and bill usage. Few enterprises can currently do so. Achieving all of this will take time and tears.