Tuesday, December 18, 2007

8007000e error when deploying Analysis Services 2005 project

I had some kind of machine specific project corruption, possibly to do with SourceSafe. Before troubleshooting your project, try deploying from another workstation - this worked for me.

Friday, December 14, 2007

Visual Studio using analysis services based report models is broken

Designing reports using a report model based on an Analysis Services database seems such an obvious and natural thing to do.

  • Analysis Services provides a rapid querying engine
  • The report model requires no development to generate
  • Report developers can avoid having to learn MDX
Unfortunately this appears not to have been an area of focus for Microsoft. Prior to SQL Server 2005 SP2 it was workable but annoying; now it is unusable.


"A connection cannot be made to the database. Set and test the connection string. Additional information: TODO - change code to use RsProxy object (Microsoft.ReportingServices.QueryDesigners)"

This error is thrown on the data tab of a Visual Studio 2005 Reporting Services project if:

  • Your data source is a report model generated from an Analysis Services 2005 database via Report Manager
  • You have SQL Server 2005 SP2 Client Tools installed on your workstation
  • You have manually added Perspective xml to your rdl code in an attempt to work around a known issue (Perspective not specified automatically for a report created in Report Designer)
If you remove the Perspective xml then the data tab won't error, but the prior known issue will kick in with one of the following errors when you attempt to execute the query or preview the report:

"An error occurred while executing the query. Failed to execute query. Additional information: Failed to execute query. (Microsoft.ReportingServices.DataExtensions)"

"The ‘PerspectiveID’ custom property for the ‘query’ perspective is either not set or is not set to the string data type."

Plenty TODO if you ask me.

Tuesday, October 23, 2007

Wherescape Red and Oracle Express - take 2

The Validate Privileges option in Wherescape Setup Administrator (WSA) is supposed to assist you in getting your admin user granted up and ready to go.

So based on its recommendations you go into the XE home page and tick the following:
  • Create Database Link
  • Create Materialized View
  • Create Procedure
  • Create Table
  • Create View
and then run a little script as follows:

grant select any table to bob;
grant query rewrite to bob; --why the heck is this deprecated stuff here??
grant execute on sys.dbms_lock to bob;
grant select on sys.v_$session to bob;

and you should now be good to go, right?

Nope. If you haven't created any tablespaces and you don't select any on metadata creation, you will get a ton of errors related to availability of tablespace "USERS". Where did this come from? I didn't specify a tablespace anywhere and there doesn't appear to be a WSA default tablespace option, but WSA has gone and decided to plug it into the metadata creation sql.

The quick and dirty for getting round this is to give bob DBA rights. Thankfully this is just a tickbox away in XE.
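A less heavy-handed alternative (untested sketch - it assumes the USERS tablespace actually exists in your XE instance and that bob is the user WSA connects as) would be to point bob at a tablespace explicitly and give him quota on it:

```sql
-- Sketch only: give bob somewhere to create tables without full DBA rights.
-- Substitute whatever tablespace WSA's metadata creation sql is expecting.
alter user bob default tablespace users;
alter user bob quota unlimited on users;
```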

Friday, October 19, 2007

Getting Oracle XE on Windows with Virtual PC to work

Pretty much every time I have installed Oracle XE on Windows without taking any preparatory steps, I have had:

  • the home page go "page not found"
  • "shared memory realm does not exist" errors in sqlplus
  • port 8080 not showing up in lsnrctl status or netstat -an | find "TCP"
  • inaccessible Oracle odbc dsns with "tns:no listener" errors
My main preparation steps before installing XE on a Windows VPC:

  1. Install the Microsoft Loopback Adapter
  2. Use a static IP address
  3. Disable all other networking
If this doesn't get it working, forget about checking whether the services are up - they will be.

Instead open up a command prompt and enter the following:

sqlplus / as sysdba
SQL> startup

This gets rid of the various problems I experience, but it is in no way comprehensive. Oracle seems happy to leave its customers trawling Google for laundry lists of symptoms and fixes compiled by various bloggers and tweakers.

This isn't a bad article (though not specific to XE).
http://www.dba-oracle.com/oracle9iAS_tips_windows_service.htm
Neither is this
http://psst0101.wordpress.com/2007/06/13/getting-oracle-10201-to-start-automatically/

Friday, September 28, 2007

isnumber in sql

http://laurentschneider.com/wordpress/2007/09/isnumber-in-sql.html

select x,
       case when regexp_like(x, '^[+-]?(\d+\.?\d*|\.\d+)([eE][+-]?\d+)?$')
            then to_number(x)
       end n
from t;
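Laurent's pattern can be sanity checked against a few inline sample values (my own test values, not from the original post). Note the straight quotes - the curly quotes that blogs like to substitute will break the query:

```sql
-- '12.5.6' fails the pattern and yields NULL in n; the rest convert cleanly
with t as (
  select '123'    x from dual union all
  select '-12.5'    from dual union all
  select '1.2e-3'   from dual union all
  select '12.5.6'   from dual
)
select x,
       case when regexp_like(x, '^[+-]?(\d+\.?\d*|\.\d+)([eE][+-]?\d+)?$')
            then to_number(x)
       end n
from t;
```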

Monday, September 24, 2007

Wherescape Red and multi stage ETL

In Red, you associate dimensions with a fact in the Stage table. If your load table has data that needs to be transformed before it can be used to join to the appropriate dimension business keys then you may want a 2 step stage process to keep the transformations readable.

For instance if your load table has a column that contains data in the form ddmon (e.g. 01SEP) this can be converted to a date before joining to dim_date to get the dim_date_key. So stage_table_1 would have a date column derived from the ddmon column and stage_table_2 would have the date column plus a dim_date_key column using the relevant join criteria.

Of course stage_table_2 could be derived in one step but my preference is to keep the transformations simple.
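As a sketch, the two steps might look like this in Oracle-flavoured SQL (the table and column names are made up for illustration, and the year is assumed to be the current year since ddmon doesn't carry one):

```sql
-- Stage 1: turn the ddmon text (e.g. '01SEP') into a real date
insert into stage_table_1 (trans_date)
select to_date(ddmon_col || to_char(sysdate, 'YYYY'), 'DDMONYYYY')
from load_table;

-- Stage 2: look up the surrogate key on the now-clean date column
insert into stage_table_2 (trans_date, dim_date_key)
select s.trans_date, d.dim_date_key
from stage_table_1 s
join dim_date d on d.calendar_date = s.trans_date;
```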

Wednesday, September 19, 2007

Wherescape Red and data repositories

I wanted to create a separate data repository from WslWarehouse (which is full of metadata), so I created a new SQL Server database instance called Test_Warehouse on the same server.

I had to create a system DSN called Test_Warehouse pointing to this instance. I don't see the need for this step. Red supports SQL Server in theory which means that I should be able to specify the server, database and credentials somewhere.

Then in Red I right-clicked Connections, selected New Object and specified properties as follows:

Name of object: Test_Warehouse
Connection Method: IP
ODBC Source: Test_Warehouse
(No need for credentials as I specified Windows Authentication in the DSN)

The next step is to get the load table to create itself in Test_Warehouse. I thought that Change Connect/Schema would do the trick but it made no difference. After much messing around and consulting help files I still didn't figure out how to do it. So it's back to mingling meta and data...

Wherescape Red and fixed width text source

Two annoyances:

1. The data load wizard doesn't let you throw anything away in the source. You have to create junk columns to hold separators and other unneeded file content.

2. If you use the Back button in the data load wizard, it appears that you lose your column specifications. Actually the column definition has already been created and you are creating a duplicate. Clicking OK to Create and Load when you finish the wizard will not work because of this, and you will have to go back and delete the old column definitions.

If you get the following errors, create a c:\temp directory.

Attempt to bulk insert has FAILED![Microsoft][SQL Native Client][SQL Server]Cannot bulk load because the file "c:\temp\wsl6.fmt" could not be opened. Operating system error code 3(The system cannot find the path specified.).

[Microsoft][SQL Native Client][SQL Server]Cannot bulk load because the file "c:\temp\wsl6.fmt" could not be opened. Operating system error code 3(The system cannot find the path specified.).


Wherescape Red and creating facts

Much the same as dimensions, but I got stuck going from load to fact. I eventually figured out that stage tables are not just a nice-to-have - they are pretty much essential, as there are wizard steps that won't appear when constructing facts directly from load tables.

Tuesday, September 18, 2007

Wherescape Red and projects

At first I couldn't figure out how to drag-drop or multi-drag objects into a newly created project. Surely I wouldn't have to do them one at a time?? Turns out the trick is to select the container (for instance, Load Table) then make my selections from the middle pane, right-click and Add to Project.

Thursday, August 23, 2007

Wherescape Red and creating dimensions

My data source is a very denormalised reporting-style source system extract. I need to pluck out the dimensions and facts that are embedded in this source data, and I was hoping Red's much touted code generation would help me through this.

Dimension #1
- Dragged a load table into the dimension middle panel and specified the name
- Selected Normal dimension type
- Clicked OK to accept definition
- Clicked Finish to avoid creating the dimension based on the load table
- Deleted lots of irrelevant columns
- Right-clicked the dimension (in the left panel) and selected Create (recreate)
- Right-clicked the dimension and selected Properties
- Specified Update Procedure as ***Create New Procedure ***
- Selected the business key column
- Right-clicked the dimension and selected Update

Phew, I'm done! It didn't take long but I had a lot of steps to remember. It strikes me that a suitably constructed wizard could have completed a cycle of creating multiple dimensions and facts from one load table.

Dimension #2
More of the same, except that the Update failed with error:

-2 dim_2 update FAILED with error 2601 Step 100. Error Msg: Cannot insert duplicate key row in object 'dbo.dim_2' with unique index 'dim_2_idx_A'.

The dim_2 table was partially populated, so it appears a specific cursor row upset it partway through the data load. My first guess was there were multiple cursor rows for the same business key. However queries on the load table didn't find any.

Turns out that the generated update procedure doesn't handle nulls. To fix this I had to:
- right-click the dimension
- select properties
- click Rebuild for the update procedure
- tick Allow where clause editing and click OK to leave the business key as is
- type where business_key_column is not null and click OK

I guess there may be a tick box somewhere to make the procedure ignore nulls but I couldn't find it.
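For what it's worth, the cursor the generated procedure builds presumably ends up selecting something like this once the where clause is added (table and column names are placeholders):

```sql
-- Exclude rows whose business key is null so the unique index isn't violated
select *
from stage_table
where business_key_column is not null;
```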

Now on to dimension hierarchies! I right-clicked the dimension and selected Add Hierarchy and it only offered me the option to reorganise the existing fields. I had to create separate dimension columns for calculated hierarchies by right-clicking the dimension and selecting Add Column.

Comments:
- The fields are mostly self explanatory but not very typo-proof, being free text fields.
- At various points it is necessary to update the dialog box which is done by clicking the Update button. Annoyingly this closes the dialog box and I had to reopen the column properties to be able to continue defining the column.
- The transformation tab could potentially hold some serious sql but is fixed size with limited IDE functionality.

Wednesday, August 22, 2007

Wherescape Red and Excel data source

I wanted to use a multi tab multi column Excel file as a data source. I didn't do any prior reading of the help file or documentation, apart from my previous walk through of the tutorial. Call it an ease of use test :)

I dragged the xls file into the load table area and was presented with a dialog box Specify the type of load required. I selected the 2nd option File Load (columns parsed), which superficially appeared to be the right choice, but nothing appeared to happen. I couldn't get the corresponding load table to be displayed in the browser.

I closed and reopened the browser, and the load table was displayed in the browser. However as soon as I tried to look at its properties etc I got the message The specified load table could not be found. I performed a validity check of the metadata and a problem was reported: There are no objects in ws_load_tab with an object key of n.

I wasn't prompted with any metadata cleanup options and I couldn't find any repair functions on the menus. So I opened up the wslwarehouse database in SQL Mgt Studio* and poked around in the metadata, eventually identifying the load table object definition in ws_obj_object. I deleted the offending row and ran another validity check which advised that the metadata was clean.

I've revisited the help file and I can't find anything specifically saying Excel isn't usable as a source, but I can't find instructions for loading Excel either. I think I can assume by omission that it's unsupported. Regardless this was pretty ungraceful error handling.

So I saved each xls tab as a separate tab-delimited text file and dragged them in. This time a dialog box popped up and the load definition process started. My thoughts on the wizard:

- It doesn't make any effort to identify the delimiter, header row or row separator. And char(9) isn't adequate to specify a tab; it has to be CHAR(09)
- It makes rather poor data type suggestions given that it's obviously sampling the data.
- Manually entered data types aren't added to the data type drop-down.
- The data happened to have a column header called Procedure which caused table creation and population to fail because it's a sql keyword. The wizard didn't identify this and change the column name to something else.
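The reserved-word problem at least has an easy manual fix - rename the column, or if you really must keep the name, quote it (SQL Server syntax; the table name here is made up):

```sql
-- [Procedure] is bracket-quoted so SQL Server doesn't parse it as a keyword
create table stage_example (
    [Procedure] varchar(50) null
);
```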

* Yes I've given up on XE/APEX which has become unavailable on reboot too many times for my limited patience. SQL Server doesn't die inexplicably on reboot, doesn't get upset when used on a virtual machine with no network adapters, doesn't stop working because it was installed by a domain admin, etc etc.

Monday, August 20, 2007

The Wherescape Red Sucks Honeypot Post

I've been writing boring notes to self in this blog, fully expecting nobody to bother commenting. Now that someone has actually written a comment, I'm completely freaking out! I can't help but suspect that the commenter has undisclosed interests and is engaging in viral marketing. A lot of vendor-aligned consultants these days are expected to provide product discussion forum feedback as a background task - perhaps some vendors have stipulated public comms responsibilities too for their consultants?

Apologies if you think this is all too cynical and even somewhat misanthropic.

Oracle Express and VPC

The Apex site was unavailable after reboot, and sqlplus returned this error:

ORA-01034: ORACLE not available
ORA-27101: shared memory realm does not exist

This thread discusses possibilities and suggests searching on OTN forums but the forums are currently unavailable!

I think it may be something to do with the assigned ip address, but discussion suggests Oracle should still be available via sqlplus.

Friday, August 17, 2007

Wherescape Red and Oracle Express

Before installing the Red tutorials, do the following to save yourself manual tidyups:
* Create tablespaces called DATA and INDEX
* Create a user called wtutorial

What does this error mean during tutorial install?

insert rows into ws_hie_link(hl_hie_key, hl_obj_key, hl_child_key, hl_parent_col_key) values(ws_hierarchy_seq.currval, 71, 027city#) [Microsoft][ODBC Driver for Oracle][Oracle]ORA-00917: missing comma

Haven't figured out what to do about Red's attempts to create bitmap indexes.

PerformancePoint CTP4 appears overnight like a tent city

http://connect.microsoft.com

Announcement came in the form of a handful of posts by independent bloggers.

I'm guessing that a VPC image will be available in about 3 weeks time based on previous track record.

Thursday, August 16, 2007

Don't use @ symbol in passwords if Oracle is in the building

The standard administrator password for Microsoft Virtual PC images is pass@word1

If you install an Oracle database on one of these images then the administrator will never be able to connect to the database, because @ marks the start of the connect identifier:

connect user/password@destination
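If changing the password isn't an option, SQL*Plus will usually accept the password in double quotes so the embedded @ isn't treated as the connect separator - worth a try, though I haven't verified it on every version:

```sql
connect administrator/"pass@word1"@destination
```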

Friday, August 10, 2007

Forum post about asp.net locales

From bhughes10 on 9/28/2006 4:51:02 PM
I saw a question from Norman Yuan several months back with the same problem I'm having now. Never saw an answer. My currency that should be displayed as $200.00 is displayed as A200.00 randomly when my production server is under load. It doesn't happen on the dev computers. An IIS reset temp fixes it but soon returns. I'm running ASP.NET app, .NET 1.1, developed on VS.NET2003. Windows Server 2003 Std. with latest SP, IIS6.0


From bhughes10 on 10/4/2006 4:43:01 AM
It's been 2 days with heavy loads on my server and no problems. I took your suggestion and created a new application pool for the site. Thanks Carl.
"Carl Daniel [VC++ MVP]" wrote:
> Norman Yuan wrote:
> > Still no solution here. Originally, I though it could have something
> > to do with our incapable server (PIII700Hz, 512MB, running Win2003
> > IIS6 and SQL Server2000 on the same box). But after upgrading to a
> > brand new, powerful server, the problem is still there. The $ sign
> > randomly changed, but not alway on the same web page. That is,
> > browsing the same page with different query string parameters, the $
> > sign sometimes is OK, sometimes is not.
> > Very frustrated with no cause found.
>
> This is caused by something running on a threadpool thread that changed the
> default locale. We ran into this issue with an ASP.NET application as well
> and through trial and error identified another ASP.NET application running
> in the same app pool that was changing the locale to German for some reason.
>
> There are two approaches you can take that might help:
>
> 1. Explicitly specify your intended locale (i.e. IFormatProvider) every
> where you convert numbers to strings.
> 2. Separate all applications into their own completely isolated application
> pools to keep locale changes from one application from "leaking" into
> another.
>
> -cd

Thursday, August 9, 2007

P2V, SSIS, asp.net and datetime

Recently I did a P2V of a production Windows 2003 server in order to create a dev environment.

Among other things it comprises an asp.net site that calls an SSIS package.

When I attempted to debug the existing SSIS package in visual studio, it failed with "string was not recognized as a valid datetime". When I checked the regional settings of the windows server I found they had not survived the P2V process and had reverted to US english. This post discusses the language translation issues that can occur with SQL Server, so as per previous blog post I updated the windows default regional settings. I left SQL Server as us_english as this is how it's set up in prod.

End to end debugging was fine after this. However when I deployed assemblies into the dev environment I got exactly the same error at the point of calling the SSIS package from asp.net. Suspicion immediately fell on the app pool identity, but its database user has the same default language as all the other users - us_english.

I then noticed that Sharepoint on the same box was displaying US style dates. So it appeared that IIS or asp.net had a different default locale from windows! But no, updating the Sharepoint locale info didn't fix the asp.net site. It turns out that Sharepoint sites have their own locale info which is drawn from Sharepoint's own settings.

I decided to repair IIS but found that I couldn't get to Add/Remove Windows Components from the Add or Remove Programs control panel - I got an error as follows:

Setup library ocgen.dll could not be loaded, or OcOMEntry could not be found. Contact your system administrator. The specific error code is 0x7F.

Presumably this was another P2V side effect. I ran sfc /scannow but this didn't help.

Giving up on this, I went back to investigate the site's asp.net global configuration and noticed that the application settings had dates stored in the local format! So the translation was happening on the site itself. On the application tab I found that the culture and UI culture had changed to af-ZA (Afrikaans). Actually I suspected the values were blank and the dropdowns were just displaying the first entries on the lists. So I updated them to the local culture and reset IIS. The sites were now using local dates.

Some forum post

RE: SSIS, relative path configuration file references and asp.net - David Norman, 2/28/2006 7:19:28 PM

Sorry, this information is not correct. I was viewing logs that excluded OnWarning messages.
As far as I am currently aware, the only guaranteed place to put configuration files where they will be found by an asp.net site using an app pool identity other than Administrator is in \windows\system32\inetsrv

"David Norman" wrote:
> Config files for child packages called from asp.net will also be found during
> execution relative to this directory:
>
> \Documents and Settings\App Pool Identity User\
>
> So to have two separate package collections on the same server e.g. for test
> and live, you need two sites using different app pools with different
> identities. Otherwise both collections will use the same config files.
>
> I think the way to structure this so you still have the flexibility to run
> the packages both directly and via the site is to put copies of the config
> files in the app pool home dir. The downside is you can get out of sync and
> access to the directory needs to be opened up to more people.
>
> But I won't add my voice to the chorus of SSIS bloggers who trumpet the need
> to follow disciplined practices - template packages, carefully designed
> folder structures, etc. These are just workarounds for a not very easy to use
> environment.
>
> One thing I believe IS good practice is to not edit live packages directly.
> If you're happy to hack away at the config file paths in live .dtsx files
> then you can you ignore these posts entirely.
>
> Hope this helps somebody, it's taken me a long time to figure all this out.
> David
>
> "David Norman" wrote:
> >
> > Found it!!!!! Desperation wins through!!!
> >
> > \Windows\system\inetsrv (app pool location)
> >
> > But it's not going to be a very well structured package environment. And I
> > don't know if different locations for test and live will be possible.
> >
> > All in all, fairly ridiculous if you ask me.
> >
> > "David Norman" wrote:
> >
> > > I have a package collection with one parent package that calls several child
> > > packages that share a common configuration file referred to using a relative
> > > path (everything is in the same folder).
> > >
> > > If I put the child packages and config file in a different folder and execute
> > > the parent package, the child packages fail to find the config file. If I put
> > > the config file with the parent package, the child packages find it.
> > >
> > > With an asp.net site - the child packages don't find the config file in
> > > either their own or the parent package folder.
> > >
> > > This has led me to believe that the location where the "executable" is
> > > launched dictates where the config file needs to be. On this basis I have
> > > tried putting it in the following (increasingly desperate) locations:
> > >
> > > virtual dir root
> > > virtual dir bin
> > > site root
> > > drive root
> > > \Program Files\SQL Server\90\Binn (location of MsDtsSrvr.exe)
> > > \Windows\System (location of W3SVC)
> > > the "My Documents" directory of the app pool identity
> > >
> > > One exception is that the config file is found if the app pool identity is
> > > an administrator of the SQL Server box - but this is not an acceptable
> > > solution. However at the rate I'm removing security in my search for a
> > > working setup, it won't be long before everyone's an administrator anyway!
> > >
> > > Can anyone give me any guidance?
> > >
> > > Ta in advance,

Multiple SSIS environments on the same server

Got Dev and Test (and even Prod) environments on the same box? Want to avoid hacking the package xml when promoting SSIS solutions? There are a couple of configuration options:

1. XML configuration file

The key here is to use a relative path. The wizard guides you to click the Browse button and specify an absolute reference e.g. c:\ssisdev\config.dtsconfig. Instead, just type in config.dtsconfig. SSIS will look for the config file in the package location.

Depending on how you structure your solution, you don't necessarily want your config files in the same location as the packages that use them. I suggest you give up on this now and accept that all config files will have to be in the same location as the packages that use them.

This is because longer relative paths such as ..\config.dtsconfig don't work (this particular example saves the config file to C:\Program Files\Microsoft Visual Studio 8\Common7 !!!)

2. SQL Server

This is a two phase approach. Firstly, set up a SQL Server configuration table. As part of this process you will reference or create a database connection. Then (and this is the cunning bit) create an XML config file with a relative path (as per option #1) that contains the connection string for the connection. You can then specify different configuration table locations in the xml config files.

This option also uses xml config files and is hampered by the same difficulties as option #1. You will need a separate config file for every package location, and you will need to maintain these config files individually. You could design to ensure your config files are all identical and write a script to copy them as part of the build; whether this is worth the effort depends on your solution.
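For reference, the configuration table the SSIS wizard generates (named [SSIS Configurations] by default, from memory - check your own instance) looks roughly like this:

```sql
create table [SSIS Configurations] (
    ConfigurationFilter nvarchar(255) not null, -- matches the filter you give each package
    ConfiguredValue     nvarchar(255) null,     -- the value applied at run time
    PackagePath         nvarchar(255) not null, -- the property the value is applied to
    ConfiguredValueType nvarchar(20)  not null  -- e.g. String, Int32
);
```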

SSIS package deployment does SOMETHING

Everything I read out there says that SSIS package deployment to file system does nothing more than a file copy.

I created some SSIS packages and called them from an asp.net site on the same box. However the asp.net site couldn't find the packages until I deployed them to the file system.

So deployment does something, either to the permissions of the deployed packages or by registering them as available somehow. Is there a hidden GAC equivalent for packages (GPC??)

Monday, August 6, 2007

SSRS setup

Tried to bring up http://server/reports and got the error:

The report server cannot decrypt the symmetric key used to access sensitive or encrypted data in a report server database. You must either restore a backup key or delete all encrypted content. Check the documentation for more information. (rsReportServerDisabled) (rsRPCError) Get Online Help Bad Data. (Exception from HRESULT: 0x80090005)

Following instructions here got me nowhere as I got the same error on attempting to extract the symmetric key.

Apparently this is the wrong approach for SQL2K5, I found in the following comment:

For Sql Server 2005 you can change, backup and restore via Reporting Services Configuration Manager.
If you have no backup of the Encryption key you can delete the key and re-create using the change option. BUT BE WARNED!, you must re-enter connection strings and stored credentials in the reports and shared data sources after this.
Refer to the following technet article BEFORE doing this.


Unfortunately I couldn't get the Initialization to change from x to tick. So I reinstalled SSRS and went through the Reporting Services Configuration Manager step by step to reconnect SSRS services to the SSRS database. At database setup it moaned about existing keys and the Initialization icon stayed as an X. So I went to the Encryption Keys step and deleted keys. At first this didn't look like it had improved things as the Initialization still appeared to have failed, but a refresh updated it to a tick.

Friday, August 3, 2007

Business Scorecard Manager development notes

It's a typical requirement to have a scorecard that needs regular updating to refer to a new current period. It's also typical that the database contains data for future periods which means Time Intelligence can't be used to get the scorecard to dynamically update itself. So what options does MOBSM give us for programmatic control?

For the scorecard, all the necessary objects are exposed as per the sample code which updates the current and previous members for all kpis in all scorecards.

IBpm pmServer = PmService.CreateInstance(BsmServerUrl);
ScorecardCollection scorecards = pmServer.GetScorecards();
for (int i = 0; i < scorecards.Count; i++)
{
    Scorecard scorecard = scorecards[i];
    KpiCollection kpis = pmServer.GetKpisFromScorecard(scorecard.Guid);
    for (int j = 0; j < kpis.Count; j++)
    {
        Kpi kpi = kpis[j];
        KpiMeasure kpiActual = kpi.Actual;
        kpiActual.TimeIntelligenceCurrentMember = new Member(/* specify member here */);
        kpiActual.TimeIntelligencePreviousMember = new Member(/* specify member here */);
        pmServer.CreateKpi(kpi);
    }
}


Unfortunately it's not quite so easy to update the periods displayed in chart report views. The Kpi.ReportView.CustomData attribute is an xml string that contains the report view definition, and lurking within this is more encapsulated xml containing the chart definitions. By the time you're finished you will know xml traversal like the back of your hand (and if you have the same attitude to xml as me, you will like the back of your hand a LOT better!).

Addendum:
I realised that I'd be a lot better off adding some calculated members and hierarchies to the time dimension and referring to these instead of updating the current and previous time intelligence members for the scorecard and traversing the chart's IncludedMembers.

The techniques described above are more appropriate for dynamic update based on user selection than as part of a period end process.

Wednesday, August 1, 2007

SSAS development notes

When the data source has changed, use menu option Data Source View:Refresh to update the dsv.

After creating dimensions, open the Cube definition and right-click in the Dimensions tree to add the new dimensions.

Deployment error: Failed trust relationship between workstation and server
http://prologika.com/CS/blogs/blog/archive/2007/06/11/trust-relationships.aspx
Fix: Remove all old domain references from roles

If you're having problems getting changes to display in your cubes, reprocess and redeploy absolutely everything. Reconnect and refresh the cube browser too.

I updated the solution deployment target to a different server but it kept reverting to the previous target. SourceSafe informed me that the only file that changed when I updated the deployment info was the .suo file. Further reading on the web suggests that source should be managed from within Visual Studio only, not via SourceSafe - things are more likely to work properly that way.

Some guy wrote in a newsgroup:

Figured out how to get the members to appear in date order - it's not as simple as setting attribute orderby properties. The steps I took are as follows:

open the time dim
dimension:add business intelligence
specify attribute ordering
for my mmm-yy attribute, select ordering attribute = new attribute
select a suitable ordering attribute e.g. one with date data type
new attribute appears: "ordering mmm-yy onwhateverorderingattributeyouselected"
next and finish

Hope that helps somebody
David

"David Norman" wrote:
> If I have date data 2005-01-01, 2005-02-01, 2005-03-01 in Excel and use this
> as the X axis for a chart, I can format it as "mmm-yy" and the axis will
> display Jan-05, Feb-05, Mar-05.
>
> How do I achieve the same result using the OWC chart component and cube data
> source?
>
> If I drag the date dimension level from the field list to the category axis
> containing the same data, I don't have the equivalent date formatting options.
>
> If I create a named calculation Left(Month,3)+'-'+Year then the members list
> in alphabetical order instead of date order i.e. Feb-05 appears before Jan-05
> on the axis.
>
> Any suggestions appreciated,
> David

New ordering attributes can break existing ordering that uses the same attribute. It may be necessary to remove all ordering attributes and recreate them together.

Friday, July 27, 2007

Attaching to network printer without directory or dns assistance

If you visit another company's office and connect your laptop to their windows network, you'll find that you aren't a member of their domain which can make a lot of typical office activities a bit more difficult to achieve. Printing is a biggie.

The first step is to find out the IP address of the printer from someone on the domain. Get them to bring up the Printer Properties dialog and go to the Sharing tab, where the printer will have a share name. Pinging the share name from the command prompt will give you the IP address.

Now you can Add Printer. Here's where it gets a bit counterintuitive: you select Local Printer, not Network Printer (I always thought local meant directly connected to your PC, but there are more options in this wizard path than I realised). Select Create a New Port and choose Standard TCP/IP Port. Enter the IP address (the port name will be completed for you).

Then you have to select the Manufacturer and Model. My experience has been that if I choose something in the ballpark then plain old document printing will usually work, e.g. the HP LaserJet 4000 Series PS driver will print to an HP LaserJet 4345 mfp, though the printer's fancier features may be unavailable.

The guidance out there on the web is, when confronted with multiple similar choices, to prefer a PS driver first, then PCL6, then PCL.

SSIS development notes

Mirror any settings you change in the package in your development package configuration. Otherwise you risk the config values blatting your changes.

Click OK to save transformation changes, don't just close the editor!

When tidying up the dtsConfig XML formatting, don't put newlines or tabs between the ConfiguredValue tags - the whitespace becomes part of the configured value.
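To illustrate (a hypothetical dtsConfig entry - the Path and value here are made up): XML whitespace inside the element is taken literally, so the second form feeds SSIS a value padded with a newline and tabs.

```xml
<!-- OK: the value is exactly "DBSRV01" -->
<Configuration ConfiguredType="Property" Path="\Package.Connections[Source].Properties[ServerName]" ValueType="String">
  <ConfiguredValue>DBSRV01</ConfiguredValue>
</Configuration>

<!-- Broken: the value now includes the surrounding newlines and indentation -->
<Configuration ConfiguredType="Property" Path="\Package.Connections[Source].Properties[ServerName]" ValueType="String">
  <ConfiguredValue>
    DBSRV01
  </ConfiguredValue>
</Configuration>
```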

This error:

Error at Create x Table [DTS.Pipeline]: The index is not valid.

ADDITIONAL INFORMATION:
Exception from HRESULT: 0xC0048004 (Microsoft.SqlServer.DTSPipelineWrap)

just means that SSIS is playing up. Recreate the offending data source (I wonder if any of the service packs fix this?)

When copying and pasting data flow elements, unselect the copied elements before pasting.

You've added a dozen package variables in the wrong scope, and the scope can't be edited in the variable view. Aaargh! Do you have to delete and recreate them? No: you can open the code designer view of the package, find the XML for the variables and move it to the appropriate place. It's not really much easier, I'm afraid.
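As a sketch of what to look for (heavily simplified, and the element names are from memory - a real .dtsx is far more verbose): a variable's scope is determined by which executable container its <DTS:Variable> element sits inside.

```xml
<!-- Simplified sketch of a .dtsx; not a complete, valid package -->
<DTS:Executable DTS:ExecutableType="MSDTS.Package.1" xmlns:DTS="www.microsoft.com/SqlServer/Dts">
  <!-- A variable declared here is scoped to the package -->
  <DTS:Variable>
    <DTS:Property DTS:Name="Namespace">User</DTS:Property>
    <DTS:Property DTS:Name="ObjectName">MyVar</DTS:Property>
    <DTS:VariableValue DTS:DataType="3">0</DTS:VariableValue>
  </DTS:Variable>
  <DTS:Executable DTS:ExecutableType="SSIS.Pipeline.1">
    <DTS:Property DTS:Name="ObjectName">Data Flow Task</DTS:Property>
    <!-- Move the whole <DTS:Variable> element in here to scope it to the task -->
  </DTS:Executable>
</DTS:Executable>
```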

When columns are added to the data source, the Union All transformation fails to recognise their existence. This transformation has to be recreated.

If you're developing on a virtual machine that has been created by restoring from a windows backup, the first tab of the Regional and Language Options control panel may revert back to US English. Changing all the settings on the front tab will get your usual desktop apps behaving properly, but SSIS may still be confused! If you have a SQL task that calls a stored procedure with a date parameter, you may find that the sp will receive a date in the wrong format. The way to fix this is to select the Default User Account Settings tick box on the Advanced tab of the Regional and Language Options control panel.
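The symptom is the classic day/month swap - the same string parses to different dates under US and UK conventions. An illustrative Python snippet (nothing to do with SSIS itself, just the format ambiguity):

```python
from datetime import datetime

s = "07/03/2007"

# US English convention: month first
us = datetime.strptime(s, "%m/%d/%Y")  # 3rd of July
# UK English convention: day first
uk = datetime.strptime(s, "%d/%m/%Y")  # 7th of March

print(us.strftime("%Y-%m-%d"))  # 2007-07-03
print(uk.strftime("%Y-%m-%d"))  # 2007-03-07
```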

When executing solutions or packages within Visual Studio I get duplicate designer windows opening up. The icons for these dupes are the same as for miscellaneous files. The main nuisance is if after execution I decide to make further changes and I mistakenly change the dupes instead of the originals. What are these windows for?

Thursday, July 26, 2007

Restoring windows backup didn't restore all regional and language options

Strange, restoring into a vanilla os from a backup file didn't bring across the regional settings on the first tab of Regional and Language Options.

I only noticed this because I hit a documented SSIS error.

Wednesday, July 25, 2007

SQL Server 2005, Visual Studio 2005 and Visual Studio 2005 SP1

If you have installed both SQL Server 2005 and Visual Studio 2005 then probably both of the following will be installed:

Microsoft Visual Studio 2005 Premier Partner Edition - ENU
Microsoft Visual Studio 2005 X Edition - ENU

Sure it's untidy that the VS2005 X install doesn't detect and replace Premier, but it hasn't caused any problems (so far). If you're like me, you haven't dared remove Premier because you're worried about regression, and probably you just use X because it is a superset of Premier anyway.

However now it's time to put Visual Studio 2005 SP1 on. What's it going to do?

It asked me if I want to install on Premier. But I don't use Premier! Oh no, it's chosen the wrong one! I cancelled the installation and to my surprise it then asked if I wanted to install on X? Phew! But now I'm not sure if I need to install on Premier also. Considering it's supposed to take a minimum of 90 minutes, do I really want to sit through that twice?

I decided to install only on X, and hit a "digital signature check" error. This post describes the fix for a software stack that's very similar to mine.

More P2V travails

Before restoring from backup I changed the registry for intelppm.sys as per previous posting. The settings didn't revert on restore so the stop error went away.

Since stop errors that require registry fixes are a nuisance to sort out, it's tempting to preemptively disable intelppm.sys and processr.sys as part of the basic steps, immediately after firing up the restore vmc.

The runtime error I posted about previously was solved by repairing the .NET Framework 2.0. Not really sure why it broke - probably for the same reason I was prompted for the dotnetfx install during the Win2K3 repair.

Tuesday, July 24, 2007

Better (but probably not best) practice for Windows/Virtual PC P2V

Basic steps:

Back up the physical system to a bkf file
Create a vanilla vhd using the operating system of the physical system
Make a copy of the vanilla vhd to a restore vhd
Create an empty backup vhd
Mount the backup vhd using vhdmount
Copy the bkf file into the backup vhd (might need to compact or split it)
Unmount the backup vhd
Configure a vmc with restore vhd as disc 1 and backup vhd as disc 2
Fire up the vmc and restore the backup
Reboot, cross fingers and troubleshoot device issues

If there are issues:

Configure another vmc with the vanilla vhd as disc 1 and restore vhd as disc 2

Fire up the vmc and use windows tools to copy required files, edit offline registry etc

Don't faff around with the Recovery Console or utility boot discs more than necessary - it's all a big pain and you'll end up having to juggle various ISOs and VFDs.

Monday, July 23, 2007

Getting big files into vhds via vhdmount

Having trouble using or copying big files from shared folders or linked drives because of throughput or memory constraints?

Mount a disc, copy in your files, then unmount and commit changes. Vhdmount then takes a loooong time merging undo discs, but at least it works!

Windows explorer integration for vhdmount helps out a lot with this.

I find this a great way to get image or backup files into blank vhds. I partition with the appropriate windows setup cd and then do a quick-format as prompted on mounting the vhd, followed by the file copy.

My experience is that if the file is really big (say 10GB) then at the end of the copy it will sit there saying "5 seconds remaining" for a long time. I haven't been patient enough to let it sit for a day to see if it completes, but there's very little activity of any kind so I think it just dies.

Resetting recovery console password and mounting vhd

Instructions at petri.co.il and http://support.microsoft.com/kb/322672.

This didn't work with the login I used, due to error: 0x32 Request not supported.

Googling for this error returned this page, which unfortunately is only available in Italian. Translated, the relevant excerpt says:

Warning: it is not possible to change the DSRM password with the NTDSUTIL utility while operating in DSRM (error: "Setting password failed. Win32 Error Code: 0x32. Error Message: The request is not supported."). However, it is possible to do it from within DSRM via the Ctrl+Alt+Del key sequence and clicking the "Change Password..." button, or by entering the following command: "net user Administrator *".

In the end I didn't need to figure this out because loginrecovery.com succeeded in deciphering the local administrator password.

I put a load of command line registry editing tools on a disc and mounted it on the guest. Unfortunately they all page faulted - they're obviously pretty old and haven't been updated to latest windows versions. I'm really surprised that I couldn't find a registry editor that can be used in the recovery console.

So then I decided to try and mount the vhd on the host so I could use the host's regedit to manually fix the guest's registry file.

I downloaded and installed vhdmount (part of Virtual Server) and from the command line I did the following:

cd c:\program files\microsoft virtual server\vhdmount
vhdmount /m /f vhdfile.vhd V:

which gave me the error "Invalid command line parameters."

I then tried just plugging in instead of mounting

vhdmount /p /f vhdfile.vhd

which resulted in error "Failed to plug in the Virtual Hard Disk (VHD). The specified file is not a valid VHD."

I then typed SET VHD_MOUNT_TRACE=1 and tried this command again which gave me:

[Info] Default user locale is 0x1409
[Warning] Dictionary VhdMountStrings.dll doesn't exist.
[Info] GetSCSIAddress called for VHD: vhdfile.vhd
[Info] Verifying file: vhdfile.vhd
[Error] Query failed: 0x57
[Error] GetSCSIAddress failed. error code = 0x37
[Info] MountVHD called for VHD: vhdfile.vhd
[Info] Verifying file: vhdfile.vhd
[Error] Invalid VHD signature found: 0x00
[Warning] Footer of the given VHD is incorrect. Hence checking the VHD's header
The Virtual Hard Disk is successfully plugged in as a virtual disk device.
[Info] VHDMount successful.


"Successful", huh. VHDMount is obviously very pleased to come anywhere near my vhd without crashing itself.

The vhd is split, so perhaps this explains the incorrect footer. I then tried to fire up Virtual PC to compact the vhd, but it hung on the splash screen. So now I know what VHDMount was successful at: giving itself exclusive access to the vhd without doing anything useful! I had to kill VPC, unmount the vhd and start VPC again.

Thursday, July 19, 2007

P2V dramas with Virtual PC and NTBackup

I've been following the instructions in this discussion group posting with the exception that I'm using Virtual PC.

Installing a vanilla os on to a vpc is obviously no big deal. But it got harder sooner than I would have liked - restoring the bkf file. I have hit the following problems:

I couldn't restore from a shared folder due to known issues.

I tried to copy the bkf file from a shared folder to the guest c drive but I kept getting a device i/o error. It appears that the host needs to have almost as much memory available as the size of the file being copied. Obviously this is problematic for bkf files that can be in the multi gigabyte range.

I then split the bkf file into multiple parts with winzip and minimised the amount of guest memory to maximise the chances of the copy completing successfully.
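WinZip did the splitting here, but the same idea - chopping a large file into fixed-size chunks so that each copy stays within memory limits - can be sketched in Python (the filenames and chunk size are illustrative):

```python
import os

def split_file(path, chunk_mb=512):
    """Split `path` into path.part000, path.part001, ... of chunk_mb MiB each."""
    chunk_size = chunk_mb * 1024 * 1024
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            part = "%s.part%03d" % (path, index)
            with open(part, "wb") as dst:
                dst.write(data)
            parts.append(part)
            index += 1
    return parts

def join_files(parts, out_path):
    """Reassemble the chunks, in order, into one file."""
    with open(out_path, "wb") as dst:
        for part in parts:
            with open(part, "rb") as src:
                dst.write(src.read())
```

After copying the parts into the guest one at a time, join_files (or the splitting tool's own rejoin) reproduces the original bkf.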

Finally I could restore the bkf file!

However when I restarted, I got an error box at the login screen as follows:

Microsoft Visual C++ 6.0 program

C:\Prog...

This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information.

Annoyingly this error box was right in front of the login box and I couldn't move the error box or close it (the mouse stopped working).

I tried entering login credentials blind without success (couldn't see what domain was selected and couldn't see the contents of any dialog boxes that popped up).

Finally I rebooted the image in the hope the error or the mouse failure would go away but I got a boot error on a black screen instead:

Windows could not start because of an error in the software.
Please report this problem as :
load needed DLLs for HAL.
Please contact your support person to report this problem.

Aha! Here was something referred to in the instructions. I booted from the Win2K3 CD and went to the recovery console to replace the hal file but found that it wouldn't take the current administrator password.

It turns out that the recovery console admin password will be the one used when Win2K3 was initially installed, and subsequent password changes don't affect this. I don't have that password and had to use http://www.loginrecovery.com/ to get the password back.

While I was waiting 48 hours (to get the password for free) I impatiently tried to repair the server using the Win2K3 CD to see if that would sort things out.

A required file called cpqteam.hlp could not be found. Who requires a help file?? Cancelled.

The next one was a bit tougher to figure out: install.exe. The default path on the c drive was c:\windows\microsoft.net\v2.0.50727\Microsoft .NET Framework 2.0. So I downloaded dotnetfx.exe, extracted the files, converted them to iso, captured the image as a cd and selected the file.

The repair chugged on a bit further and then crashed dramatically with stop error 0x000000CE DRIVER_UNLOADED_WITHOUT_CANCELLING_PENDING_OPERATIONS for file intelppm.sys.

Virtual PC Guy reckons I should disable the driver in the registry but with a non booting guest and no access to recovery console (at least until I hear from loginrecovery) I think I've hit the wall...

Tuesday, July 17, 2007

More P2V info from some forum

“The Path To Fast P2V” – The Quicksilver Project



We all have tons of physical machines we want to migrate to Virtual Hosts, and we all want to do it in a smooth low administrative cost process.

Well, all we have to do is use the VSMT (Virtual Server Migration Toolkit), right? The problem is that it's too complex to use, and in the end the hours you spend running the whole VSMT process may be even more than the time it would take to set the whole thing up again without VSMT at all!

This is what I came up with after a little research.

I managed to P2V in two simple steps, and I would like to share my experience with you.

Suppose you have a nice Windows 2003 SP1 machine running on an old server, and you want to P2V it to Virtual Server. You simply use the cloning solution of your choice (I prefer Acronis and Ghost) and deliver the image to the virtual machine as if it were a physical one.

Logically, when the virtual machine boots you get a "Blue Screen of Death" because of the hardware incompatibility.

All you have to do is boot up the virtual machine with the Windows 2003 SP1 CD and choose the R option to go to the Recovery Console, where you get the famous DOS-like console.

This is the amazing part: you copy hal.dll and ntoskrnl.exe from the Windows 2003 SP1 CD and overwrite them in the Windows system32 directory.

In simple commands you do as follows:

1. Boot the virtual machine with the Windows 2003 SP1 CD
2. Press R for the Recovery Console
3. Choose your Windows installation (usually option 1)
4. Go to the Windows system32 folder by typing cd c:\windows\system32
5. Enter the drive letter of the CD-ROM (usually d:)
6. At the d: prompt type cd i386
7. Then type:

expand ntoskrnl.ex_ c:
expand hal.dl_ c:

For each command you will get a prompt about overwriting the original files; choose yes to replace them.



Then reboot the machine, remove the Windows 2003 SP1 CD, let the machine boot, and press F8 to go to the boot options menu for Safe Mode.

THERE YOU ARE, THE TARGET MACHINE IS BOOTING WITH NO ERRORS.

IT WILL AUTO DISCOVER ALL THE HARDWARE AND IT WILL PLAY OK



100% TESTED, MIGRATED MANY MACHINES LIKE THAT.



PLEASE FEEL FREE TO CONTACT ME FOR ANY INFORMATION YOU MAY NEED.



BEST REGARDS.



ALEXOPOULOS IOANNIS MCSE MCT MVP Candidate.

Ioannis_alexopoulos@hotmail.com (my msn account, feel free to add me)

P2V info from microsoft.public.virtualserver

From: "Sylvain Lafontaine" (fill the blanks, no spam please)
Newsgroups: microsoft.public.virtualserver
Date: Sat, 17 Mar 2007 13:40:47 -0400
Subject: Re: Migrating a physical server as VM
Yes, there are other tools to migrate a physical machine to a virtual one. However, it's not always easy, and with some of these tools/methods you will have to perform a repair step to replace the HAL layer before Windows will be able to boot in its new virtualized environment. The ACPI setup in the BIOS is another thing you must check before making the P2V migration:
http://www.rtfm-ed.co.uk/docs/vmwdocs/whitepaper-upgrading-cpus-on-no...
http://blogs.msdn.com/virtual_pc_guy/archive/2004/11/24/269412.aspx
http://blogs.msdn.com/virtual_pc_guy/archive/2004/11/30/272662.aspx
http://blogs.msdn.com/virtual_pc_guy/archive/2004/12/10/279667.aspx
Here are some other automated tools or manual methods for doing your P2V (Physical to Virtual) migration:
http://4sysops.com/archives/p2v-for-vmware-migrate-physical-to-virtua...
http://www.invirtus.com/content/view/808/394/
http://www.acronis.com/enterprise/products/ATISWin/universal-restore....
http://www.vmware.com/products/beta/converter/
http://www.rtfm-ed.eu/docs/vmwdocs/whitepaper-ultimatep2v-quickstart.pdf
In the case of VMWare converter, you will have to convert it back to VHD but only after removing the previous VMWare additions; see:
http://vmtoolkit.com/files/default.aspx
http://vmtoolkit.com/blogs/paul/archive/2006/12/14/preparing-a-window...
If you want to use imaging software like Ghost or Acronis, you will have to set up a second virtual environment for the sole purpose of restoring an image directly to a vhd: first you create two empty VHDs, then you attach them to the virtual environment as the second and third virtual hard drives, copy the image file(s) to the second hard drive and restore it to the third one. All you have to do now is detach this last vhd and use it as the main virtual hard drive for creating a new virtual environment. (There are other methods, like using a booting DVD or a network connection, but these are often more complicated to set up.)
When using some older imaging software, it's also a good idea to partition the target VHD before the restoration, to make sure the master boot record on the virtual drive is correctly written. Alternatively, take the precaution of imaging the whole physical hard drive used as the source rather than just backing up and restoring a single partition (and you must do this even if there is only one partition on it); otherwise the MBR might be incorrectly written on the target virtual drive after the restoration. (You probably won't have this problem with a more recent version of these packages, but it's better to be safe than sorry.)
You might have some other problems, for example if your operating system was on the D:\ drive and you want it now to be on the C:\ drive. For a discussion of this, take a look at:
http://help.lockergnome.com/windows/Cloned-Drive-Ghost-10-Log-Drive-f...
http://groups.google.com/group/microsoft.public.windowsxp.hardware/br...
-- Sylvain Lafontaine, ing. MVP - Technologies Virtual-PC E-mail: sylvain aei ca (fill the blanks, no spam please)

Virtual PC P2V and linked disc

Following the process of creating the virtual machine as per KB912826 and starting the ASR Restore to restore the backup to it:

If after pressing F2 to initiate ASR you get the following error:

Setup was unable to restore the configuration of your system because of the following error: The capacity of the replacement hard disk drives is insufficient, and cannot be used to recover the partitions on the original system disk. The replacement hard disk drives must be at least as large as the disks present on the original system.

And you've read KB314686 and tweaked the dynamically expanding disc size but still can't get rid of the error message:

Then you've probably got the vhds the wrong way round! The dynamically expanding vhd is attached to the vm first and the linked vhd is attached second. It's possible to make this mistake if you aren't careful following the instructions in KB912826, because step 5f incorrectly refers to the "expandable virtual disk file that you created in step 3", but it's the linked disc file that's created in step 3.