Monday, May 14, 2007
What's so good about Virtualization?
In the old world, circa DCOM/CORBA etcetera, the data conformed to a hammered-out standard... which, however, was also proprietary.
Along came XML and provided a very simple language with which the data structure and content could be described to any software, so long as the software designers followed the XML 1.0 specification. The data thus had an opportunity to be "virtualized," or made malleable, as a concept and an operation. Made malleable meaning the elements of that software code could be presented in a virtual sense (a massaged copy) appropriate to the need and uses presented... likewise virtualized.
The data still had to be converted from XML to the proprietary form so the proprietary system could digest the information (unless the proprietary system is also written in XML, in which case even your computing language is virtualized - but we advance tangentially, so...), but the XML could be expressed to any machine enabled to handle XML 1.0, and the machine would at least understand the data.
From that simple model, commands have been added to the data being passed universally and we now have an opportunity to use XML to virtualize operations as well as content expression and format heritage.
Today, XML theory as embodied in the VCSY IP bundle allows any proprietary code to be wrapped in an XML layer that makes expression of the data and commands within the proprietary code "arbitrary," meaning nobody cares how it's expressed, as the XML processing at each machine will allow that machine to digest what is being provided.
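To make that concrete, here is a minimal sketch of my own (the element names and the Python rendering are mine, not VCSY's): data plus a command riding in plain XML, digestible by any conformant XML 1.0 parser on any platform.
begin sketch
import xml.etree.ElementTree as ET

# A hypothetical envelope: data and a command expressed in plain XML.
# Only XML 1.0 conformance matters; the element names are invented.
message = """
<envelope>
  <command name="updateInventory"/>
  <data>
    <item sku="A-100">
      <description>widget</description>
      <quantity>42</quantity>
    </item>
  </data>
</envelope>
"""

root = ET.fromstring(message)
command = root.find("command").get("name")
for item in root.find("data"):
    print(command, item.get("sku"), item.findtext("quantity"))
# Any machine with a conformant parser can read this; only the final
# translation into the local proprietary format is system-specific.
end sketch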
Virtualization offers to make any operating system run on any platform with any application built out of any components, and to do so with no low-level abstraction (programming code and methods/properties) presented to the user. That abstraction stays in the designer/developer level interfaces... and all of those interfaces are also virtualized, so your display works anywhere.
Now, perhaps one can see the true Vista Microsoft is giving up to the competition as they neutered their vaunted Viridian virtualization super/hypervisor.
I'm asking you eggs to think beyond competing with something like VMWare for hosting virtualized operating systems and computing resources, and to think of virtualization as a gateway to ALL operations, including automation.
Virtualization has a great deal more to do with how software creation may be automated than one would think. Put simply, virtualization allows the developer to make any component take on the traits, appearance and capabilities of other components as desired for the applied use. A uniform topology emerges from a pile of different parts with virtualized joints, allowing each chunk to be itself and work its own way, using the virtualization layer to present the proper user face/look and feel/capacity. The original code does not change and, in fact, may be a piece of code installed within a running application that the arbitration system decided was being underutilized... the borg works. That's all the borg knows to do.
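In old-fashioned programming terms, that reads like an adapter layer. A rough sketch, entirely my own construction: the original component is never modified; a virtual joint presents whatever face the topology expects.
begin sketch
# Illustrative only: a crude "virtualization layer" in the adapter style.
class LegacyPart:
    """Original code - never modified."""
    def do_it_the_old_way(self, raw):
        return raw.upper()

class VirtualJoint:
    """Presents the agreed-upon face while the chunk works its own way."""
    def __init__(self, part):
        self._part = part

    def render(self, payload):  # the uniform interface the topology expects
        return self._part.do_it_the_old_way(payload)

# The arbitration system can hand any wrapped chunk to any consumer:
joint = VirtualJoint(LegacyPart())
print(joint.render("hello"))    # HELLO - same look and feel, old guts
end sketch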
Read on McNuff:
Virtualization smirchualization, where's the bar?
Sunday, May 13, 2007
Blasts from the future.
If you're going to follow VCSY you need to know a bit of the .Net history in the news: http://msd2d.com/News_view_03.aspx?section=dotnet.
If you are an IT manager trying to get a grip on Web Services and what you need to stay in the game, you DEFINITELY need to read the following from July 17, 2003:
begin article
http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2914294,00.html
.NET: A guide for managers
By Jason P. Charvat
July 17, 2003
excerpt
What is .NET?
.NET is essentially a set of software technologies designed to connect your world of information, people, systems, and devices. Therefore it’s an environment for building, deploying, and running Windows applications and services. A fundamental idea behind .NET development is the idea of common objects--objects that are accessible anywhere, anytime, and from any device. Also, .NET technology itself is based on XML Web services, which use standard protocols such as SOAP and XML data to connect applications and Web services.
end excerpt
end article
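For those who have never looked under the hood, a SOAP call is just XML over HTTP. A bare-bones illustration (the endpoint, namespace and operation below are invented, not a real service):
begin sketch
import urllib.request

# Invented endpoint and operation, purely for illustration.
url = "http://example.com/StockService.asmx"

envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.com/quotes">
      <symbol>MSFT</symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    url,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "http://example.com/quotes/GetQuote"},
)
# urllib.request.urlopen(request) would return the XML response;
# the point is that both ends only have to agree on the XML.
end sketch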
If you are a developer, I want you to think through what you're going to do in a world where your work is arbitrated into commodityship at an astounding rate. Once the managers catch on to virtualization and arbitration powers, you're going to be relegated to programmer farms to support the individual objects Microsoft will need to keep their islands-of-automation universe perking along... while the rest of the world does the next level up in abstracted application building... and kicks your ass with your own products.
begin article
http://www.angrycoder.com/article.aspx?cid=1&y=2003&m=8&d=8
Compact Framework Blues
Jonathan Goodyear, MCSD, MCAD, self-proclaimed Internet Bad-Boy
8/8/2003
excerpt
XPath is another victim of Microsoft's attempt to reduce the size of the CF re-distributable. In this online whitepaper, Microsoft's suggested alternative is "a combination of recursive and iterative searches against the Document Object Model". Yeah...I'll bet that'll perform well; Especially on a PocketPC device with limited memory and processing power.
end excerpt
end article
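To see what that complaint means in practice, compare an XPath query with the hand-rolled walk Microsoft suggested. The sketch below is Python rather than Compact Framework C#, purely to show the shape of the trade-off:
begin sketch
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<orders><order id='1'><total>9.99</total></order>"
    "<order id='2'><total>24.50</total></order></orders>"
)

# With XPath support: one declarative line.
totals = [e.text for e in doc.findall(".//order/total")]

# Without XPath: the "recursive and iterative searches against the
# Document Object Model" you are left to write and maintain yourself.
def find_totals(node, out):
    for child in node:
        if node.tag == "order" and child.tag == "total":
            out.append(child.text)
        find_totals(child, out)
    return out

assert find_totals(doc, []) == totals == ["9.99", "24.50"]
end sketch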
Yes, oh skeptical one, the article only references .Net in a mobile framework such as CE. But the very idea that you would excuse .Net on that point outlines the problem with .Net. A different product has to be built and cobbled together for entirely different situations... rather than delivering on the "any object" "any where" "any time" claim central to .Net marketing and central to any truly arbitrary framework. .Net in this incarnation was far from an arbitration framework, much less a system using arbitrary objects. And the complaint that I'm being unfair to .Net as a whole, based on the selective view that "But this is .Net for a mobile platform," is either an intellectually dishonest feint or a "design only what they ask for" approach to engineering.
Arbitration removes uncertainty and provides machine-readable, actionable governance so that operation can be autonomous. The truly arbitrated "object" may be applied by machine-driven frameworks rather than by a programmer having to decide which development kit to buy.
Now, look at the Microsoft concept and then take a look at the IBM concept:
Microsoft will not get virtualization capabilities out to public developers until 2008.
IBM has been employing virtualized components, applications, systems, frameworks... the works since 2001 in development and client businesses.
IBM will be providing automated service systems while Microsoft is first rolling out systems to be used by humans. THOSE developers will be the ones expected to build out autonomous services once Microsoft's equipment comes online for them.
Why? Why should they? They will be competing with a race of technological giants that will flick them like fleas. Oh, Microsoft will still be cool selling, as it were, to their userbase (designers and developers) while their base fades into irrelevancy... enabling MSFT to buy them at a five and dime price and then join the 'autonomous services' paradigm around 2010... right about when they said such would be blooming.
Those Microsoft management types they are some smart, no?
Sure, there will always be programmers. I know the programmer has to get an object out there in the first place, but we are today virtually swimming in objects of every kind, built by every kind of proprietary re-invention of the wheel... and the best out of the pile will make its way to the top of the application exchanges' "best of breed" lists for inclusion into new, future and legacy developments of the composite application.
There probably is nothing of substance in actual data processing done today that has not been embodied in some perfectly usable body of COBOL or FORTRAN forty years ago. It's just that you can't use that code unless you first virtualize it so it can communicate and interoperate with other code bodies. (Look at US 7,076,521 and you can see how virtualization works, with a client agent handling the mediation between "you" and anybody else you have to hook up with and have intercourse with. Don't be so shocked; data architects have been talking like this for years. It's all you people who demigraded the codewords.) Once all the code bodies are integrated, the overarching framework of US 6,826,744 that glues and holds and implements this aggregation becomes the application. It's like cells virtualize in the goo and the body comes together as each part plays its part.
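A toy illustration of the wrapping idea (my own sketch, not the mechanism claimed in US 7,076,521 or US 6,826,744): the forty-year-old routine stays untouched, while an XML facade lets it converse with anything else that speaks XML.
begin sketch
import xml.etree.ElementTree as ET

def legacy_payroll(hours, rate):
    # Stand-in for a forty-year-old COBOL/FORTRAN routine, unchanged.
    return round(hours * rate, 2)

def xml_facade(request_xml):
    """Mediating agent: XML in, XML out; the legacy body never knows."""
    req = ET.fromstring(request_xml)
    pay = legacy_payroll(float(req.findtext("hours")),
                         float(req.findtext("rate")))
    resp = ET.Element("payroll")
    ET.SubElement(resp, "gross").text = str(pay)
    return ET.tostring(resp, encoding="unicode")

print(xml_facade("<request><hours>40</hours><rate>12.50</rate></request>"))
# -> <payroll><gross>500.0</gross></payroll>
end sketch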
Do you SEE how far Microsoft actually is from this realization (expression + experience = realization)? Yes, they will get there eventually and it will be such a surprise to the rest of the world who forgot about what they were doing long ago so their public base will be impressed... eventually. That is if all these developers stay with .Net... or if .Net survives. A whole lot can happen... and each different track may be projected and somewhat predicted as we're in a very narrow technological pass from one paradigm to another.
Anyway, as with all other technological disruptions, "Thanks oh pilgrim prospectors and developers and programmers. Now, get off the settler's land."
We are leaving programmers behind in the VCSY paradigm, whereas Microsoft, in their past, present and future .Net paradigm, has a vested interest in keeping the programmers plugged into the action. They maintain that regime today by divesting functional integration from content/format (look and feel) GUI work: one body of developers (designers) uses Expression, with a marketed, invisible line separating them from the designers (developers) in the .Net Framework.
TODAY... NOW... Microsoft is making squeaky toddles over to a total user-defined experience, but they've chosen to mince steps with words: offer tools for Expression (content and format [look and feel]) and then end up having to issue the same tools (different names, of course) for functional assembly to the designers. THAT must have been a hard decision, because I believe it put them in a do-or-die situation... the cutting of three key elements of their virtualization program essentially eliminates Microsoft being able to act as anything more than a Microsoft-managed assembly of server units AND places hardware constraints on their versioning, so they will always be a slave and never a master in an autonomous server bank. They will always need humans for specific gerrymandered tasks and thus will mask their true cost of implementation in their "low cost software packages." You will need many more developers per project than with the "high cost" application services, resulting in many more problems, issues and costs than with the web-service-as-an-application deliverables.
AND, when a composite application is built in the real world, there will need to be an army of developers to make that same thing happen in the .Net Framework implementations, due to this set of limitations on virtualizing outside software/hardware resources.
NOW we see why Software AND a Service are so important to Microsoft. They'll lose their developer base if they don't give them something to do and an artificial economy is better than no economy at all.
I don't think Microsoft intends to compete with virtualization. The question is why? To save their developer base? Are you kidding? If Mom and Dad and Junior could make Microsoft applications don't you think MSFT would slap that out there and let the developers swim on their 'let a professional' floaties?
I don't think Microsoft will be given a license to legalize.
.
Remember the dBase interface?
.
That meant you were supposed to tell the database what to do. Here we are twenty years later and we STILL have to tell the stupid things what to do. Now that the machines will understand what a command line can convey, who needs the fancy shmancy graphics tools for objects? Why? Tell your computer what you want and get off the button-pushing treadmill the graphical user interface demands. A whole lot of money for very little facility in this newly "virtualized to the gnat's pucker" world. Thanks for the memories, Microsoft. Pardon our dust.
EVERY element of an arbitrary distributed system must be ready to take into consideration the minimal platform... therefore, EVERY definition of a distributed architecture must be viewed as a "mobile" platform, even if every last computer has a lead weight in the bottom for ballast so it can't be moved. The very nature of the uncertainties of network routing and network outage demands the marshaling of metadata, and "location" is only one more piece of information.
.Net as it was originally built would not do those things. Why? Because .Net relied on SOAP, and SOAP relies on all the intelligence and business logic residing in servers rather than in client agents at the site of the user activity. If .Net has gone "the other way," and that is by client agents... well, buckaroos, not only is .Net infringing, but they have built many, many products, client products included, that do thuswise also.
And, if Microsoft has been playing hardball the way they appear to ME to have been playing hardball (it doesn't take a conspiracy to make things happen) I would say in Mister Wade's eyes it's going to be the people who've worked loyal-like with VCSY these years who are going to be rewarded with a license and Microsoft and their minions and pinions will get no grease but fire.
Most software design is done with an eye toward cleanness under the reigning situation. "What will not take place" does not have to be designed for in most applications. BUT a general-purpose application, or a framework that's touted to be "all inclusive," can't take that track no matter how attractive the monetary rewards may be, as you are stunting the platform for money... or for lack of innovation or property.
You can bluff your way through when you're the one doing the delivery of services and applications as Microsoft did during this period. But, if there comes a time when the machines start distributing your work, you had best hope nobody slipped a small device on their desktop linkup or it's the proverbial monkey wrench in the gearbox.
As we move closer to an autonomous server and service mentality, Microsoft will appear more and more antiquated. Nothing personal, just an observation. It's already been happening on a small scale as developers get hold of Vista and ask, "Is THIS what we've been waiting for?"
What happened Friday is about to enlarge that scale immensely.
pd
Friday, May 11, 2007
Worth spooning through for areas of concern:
http://astoria.mslivelabs.com/faq.aspx
General
Q: What is project codename Astoria?
A: Astoria is a project in the Data Programmability team at Microsoft that explores how to provide infrastructure and tools for exposing and consuming data in the web. Astoria can create data services that are exposed in a natural way to the web, over HTTP and using URIs to refer to pieces of data; these data services can be consumed by AJAX front-ends, Silverlight-enabled web pages, desktop applications and more. At this time we’re making available two experimental elements of project Astoria: the Microsoft Codename Astoria toolkit and the Microsoft Codename Astoria online service. The “overview” document that is available at the Astoria web site provides a more complete introduction to Astoria.
Q: What is the difference between the Astoria CTP toolkit that is available for download and the Astoria online service?
A: The Astoria toolkit consists of a set of runtime components, documentation, samples and Visual Studio integration elements that allows developers to create and consume Astoria data services in their own ASP.NET web applications. The Astoria online service is an experimental deployment of the Astoria toolkit plus added infrastructure in the Microsoft Live Labs environment that can be accessed over the internet. The online service includes a number of pre-created sample data-sets exposed as data services, and soon it will offer the option of creating custom data services to allow for further experimentation with the technology using custom schemas and custom data.
Q: When will Astoria be available for deployment in production environments?
A: We are making this early prototype available publicly to gather feedback, understand customer needs and validate our assumptions about application scenarios. We will plan the rest of the release cycle based on that feedback. Currently there is no fixed schedule for releasing a production version of Astoria.
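The shape of the thing, if you haven't tried it: a piece of data gets a URI, and anything that can speak HTTP can fetch it. A hypothetical sketch (the service and URIs below are invented in the Astoria style, not a live endpoint):
begin sketch
import urllib.request

# Invented, Astoria-style URIs - not a live endpoint. The point is that
# pieces of data become plain HTTP resources addressable by URI.
service = "http://astoria.example.com/northwind.svc"
all_customers = service + "/Customers"          # the whole set
one_customer  = service + "/Customers[ALFKI]"   # a single entity by key

# Any HTTP client gets XML back - an AJAX page, Silverlight, or simply:
# data = urllib.request.urlopen(one_customer).read()
end sketch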
And also this:
Microsoft misses performance and scale goals with Viridian – Microsoft
One month trip from gloating to tears
Viridian has gone on the Redmond Diet with Microsoft today ripping some of its most exciting planned features out of the virtualization software.
In April, Microsoft's GM in charge of Viridian Mike Neil revealed that the company would have to delay the software's beta release from the first half of 2007 to the second half. The reason for the delay? Well, Microsoft wanted to add in things such as support for 64 processors – "something no other vendor's product supports" – and on-the-fly addition of processors, memory, disk and networking. Such technology was needed so that Microsoft could "(meet) our internal goals for performance and scalability."
... connect those dots maybe with a tourniquet. There's more at the article and then this excerpt:
These issues have some analysts calling for blood.
"Microsoft has a fundamentally broken server virtualization strategy at this point," wrote Illuminata analyst Gordon Haff. "They were behind to begin with. Now, the tardy 'Rev. 1.0' is starting to look more like 'Rev. 0.5.' Perhaps it’s time for Microsoft to consider a different angle.
"Perhaps it’s time for Microsoft to admit that they can’t do it all themselves - at least for now - and form some legitimate partnerships. That would mean fixing some licensing problems and eating some crow. But that’s the cost of a broken internal development process. That Mike Neil should make reference to the 'mythical man-month' in his posting is wholly appropriate." ®
From the comments on the article (4 comments posted):
"16 cores is a limit..": Sun ...out of luck... Intel ... http://www.reghardware.co.uk/2007/03/02/intel_bloomfield_to_debut_lga1366/ ...need 32 core support... IBM's ...Unisys ... out of luck...
"Hey it's cheap stuff"
"Proper virtualization?": On IBM gear (even low end 505 machines) Dynamic partitioning operations are possible out of the box (as long as you have a Hardware Management Console).
"Power6": Just wait until July when Power6 arrives. You will have the ability to migrate running Logical Partitions between servers.
Wednesday, May 9, 2007
Finally... unstructured support... 2008????
This #1: http://www.idugdb2-l.org/adminscripts/wa.exe?A2=ind0609c&L=db2-l&P=10712
is about the difficulties a proprietary, language-bound database has at creating independence.
This #2: http://www.idugdb2-l.org/adminscripts/wa.exe?A2=ind0609d&L=db2-l&P=951
is how the DB2 users use DB2 9 (codenamed Viper) to create virtualized versions of other proprietary databases made XML enabled.
Odd, but #1 comes up in a Google search before #2, meaning somebody's only getting part of the story and seeing nothing but database problems when, in reality, #2 describes how the DB2 9 users create their own independence.
excerpts from #2:
The whole technical stuff is applied by the generator. So it is easy to add new methods (in minutes)...
For DBMS independence, you can have more than one generator (which are similar to each other)...
...we have generators for DB2, ORACLE and MySQL. ...
...we can sell our application to users of these three platforms without problems.
I always wanted to add MS SQL server, but didn't need it until now (no customer requirement)...
...almost no performance penalties, because every DBMS can have its own SQL syntax (if required). And the little overhead introduced by the call of the submodule is a price that we are willing to pay.
end excerpts
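The generator trick the poster describes is easy to picture. A minimal sketch of my own (names and dialect details simplified for illustration): one small module per DBMS emits that vendor's syntax, and the application only ever talks to the generator.
begin sketch
# Toy version of the "one generator per DBMS" approach in the excerpt.
class DB2Generator:
    def limit_clause(self, n):
        return f"FETCH FIRST {n} ROWS ONLY"

class MySQLGenerator:
    def limit_clause(self, n):
        return f"LIMIT {n}"

def top_customers(gen, n):
    # The application asks the generator; it never hard-codes a dialect.
    return ("SELECT name FROM customers ORDER BY sales DESC "
            + gen.limit_clause(n))

print(top_customers(DB2Generator(), 5))    # DB2 syntax
print(top_customers(MySQLGenerator(), 5))  # MySQL syntax
# Adding MS SQL Server is one more small class ("in minutes"),
# emitting "SELECT TOP 5 ..." instead.
end sketch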
Those of you struggling since 'Yukon' to make a SQL Server database handle unstructured material might want to give the folks at the user group in this email a tug. Either that or wait until next year for even a shot at the unstructured data world via Microsoft applications.
Why do you people put up with this kind of treatment from your vendor?
May 9th, 2007
Microsoft makes it official: SQL Server ‘Katmai’ due in 2008
Posted by Mary Jo Foley @ 8:39 am Categories: Database, SQL Server, Corporate strategy, Code names
Early reports turned out to be true: Microsoft is planning to release the next version of its SQL Server database, which is codenamed "Katmai," in 2008.
Microsoft announced officially its target date on May 9, the opening day of its first Business Intelligence conference in Seattle. More than 2,600 customers and partners are attending the three-day event, according to Microsoft.
Microsoft is expected to begin private testing of Katmai in June.
Microsoft's most recent version of SQL Server was released in 2005. Last year, Microsoft officials said the company wanted to accelerate the pace at which it delivered database releases by releasing fewer (if any) full-fledged beta builds, but stepping up the number and quality of Community Technology Preview (CTP) SQL Server builds. By obtaining tester feedback more regularly and rapidly, Microsoft's SQL team hoped to be able to release a new version of SQL Server every 24 to 36 months, officials said.
Microsoft isn't yet providing a full feature list for Katmai. But the SQL Server team is promising the new release will integrate with Office 2007, SharePoint Server 2007 and PerformancePoint Server 2007, which is Microsoft's business-scorecarding application. The Katmai release will be able to provide "reports of any size or complexity," according to the Softies, and manage "any type of data, including relational data, documents, geographic information and XML."
Katmai isn't the codename for SQL Server only. Microsoft officials also have used "Katmai" to refer to the next version of Microsoft System Center Operations Manager.
In a related move, Microsoft also is announcing on May 9 that it is acquiring OfficeWriter, a company which makes a Java reporting tool that enables people to generate Microsoft Office documents from any data source through a Web browser.
End article
Meanwhile back at the ranch:
What does database independence mean and why is it such a big deal to me?
First, each database from every database vendor, such as Oracle, Microsoft, SAP or Bob's Data Boutique, has a certain set of tools to use to make the database operate. The database is itself an application, and it is optimized to be an application for handling bodies of data.
The traditional applications handle 'structured' data in the form of relational records and tables.
IBM came out with the first means of combining the 15-20% of all data that is relational or structured with the 80-85% of all data that is not structured, and which therefore must be dealt with in a different manner than the traditionally handled, structured boxes of data we've come to think of as comprising the database industry.
So DB2 is a proprietary database like the others but with an engine or generator working on top of that IBM proprietary application that allows the data within the box to be expressed along with any unstructured data such as documents, photos, videos, presentations, code elements etc.
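To put a face on it, here is roughly what that looks like through a generic Python DB-API cursor, sketched in the DB2 9 pureXML style (the connection setup is assumed and the syntax details are from memory, so treat this as an approximation):
begin sketch
# Hedged sketch of the DB2 9 (Viper) idea: relational columns and whole
# XML documents side by side in one table. "conn" is assumed to be an
# already-opened DB-API connection to a DB2 9 database.
def demo(conn):
    cur = conn.cursor()
    cur.execute("CREATE TABLE docs (id INT, body XML)")
    cur.execute(
        "INSERT INTO docs VALUES (1, '<memo><subject>Q2</subject></memo>')"
    )
    # A structured predicate and an XML query in the same statement:
    cur.execute(
        'SELECT XMLQUERY(\'$d/memo/subject\' PASSING body AS "d") '
        "FROM docs WHERE id = 1"
    )
    return cur.fetchall()
end sketch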
If IBM has been giving developers the ability to work with unstructured data and SQL Server-maintained data... why is Microsoft not able to provide such capability as of yet?
Waiting for another industry market to mature? You're giving IBM a two-year lead and that's letting things "mature." I have news for you, friend. Read the VCSY patents and tell yourself... there's another way to transactionally virtualize proprietary data to arbitrary data in XML across HTTP... then get your crayons out and do me a paper bag drawing of something that can transactionally, and thus deterministically, transmute data from a proprietary datastore to XML and back.
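Since I'm handing out crayons, a naive sketch of what "deterministically transmute to XML and back" means (this is my toy round trip, not the method in the VCSY patents):
begin sketch
import xml.etree.ElementTree as ET

# A "proprietary" record, standing in for a row in some vendor's store.
row = {"id": "7", "name": "Acme", "balance": "100.25"}

def to_xml(table, record):
    el = ET.Element(table)
    for k, v in record.items():
        ET.SubElement(el, k).text = v
    return ET.tostring(el, encoding="unicode")

def from_xml(xml_text):
    return {child.tag: child.text for child in ET.fromstring(xml_text)}

doc = to_xml("account", row)
assert from_xml(doc) == row  # deterministic: nothing lost in transit
print(doc)                   # <account><id>7</id>...</account>
end sketch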
And then drop me a line and explain how you would build your database vendor independence that the DB2 9 Viper generator users seem to be able to so easily mate up with any vendor's database.
Any vendor's database put to work on unstructured data... and Microsoft developers have to wait until 2008 to do the same thing.
Uhhhh... yeah. Paper bag. Crayons. Draw.
Why the crayons? So you can keep it real simple.
Why the paper bag? In case you hyperventilate trying to come up with an alternative method... or in case you spew your cookies once you reach the end game.