Tuesday 19 March 2013

Generic advice about data pump

I've done a lot of data pumping lately with upgrades and other projects.  It's a great utility, but there are always some teething problems / lessons.

Firstly, the size estimate is generally fairly accurate, so if you are getting an estimate of 110GB and you are expecting 30GB, you probably have something wrong.
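If you just want the estimate without exporting anything, expdp can do a dry run. A minimal sketch, reusing the connect string from further down (ESTIMATE=BLOCKS is the default; STATISTICS can be quicker if your stats are current):

expdp system/PassW0rd@e1prod SCHEMAS=PRODDTA ESTIMATE_ONLY=YES ESTIMATE=BLOCKS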

Secondly, the use of INCLUDE and EXCLUDE is great with JDE, as this gives you a poor man's parallelism (the Data Pump PARALLEL parameter needs Enterprise Edition, so Standard Edition cannot parallelise a single job), and knowing your large tables allows you to easily perform quasi-parallelisation.  I quite often do F0911 separately (half the database) and then the rest of the tables.  I do this with an EXCLUDE par file and a matching INCLUDE par file.
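To get the two halves running side by side, just kick off both exports at once from separate sessions. A minimal sketch, with hypothetical par file names for the two halves:

expdp system/PassW0rd@e1prod PARFILE=proddta_f0911.exp
expdp system/PassW0rd@e1prod PARFILE=proddta_rest.exp

Each command starts its own Data Pump job, so the two exports run concurrently without needing the PARALLEL parameter.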

Thirdly, I generally use par files.  I like them because I can use them again; it's the developer in me that wants to use any code more than once.  It's also the old man in me that forgets things days after I've done them.

Fourthly - well, I'm sick of starting paragraphs with a [number]ly, so I'll just type.  I did try to write my dump file to a shared USB drive on another computer (outside my domain).  This proved to be impossible.  The expdp client does not write the file itself; the dump is written by server processes forked from the main Oracle process, which inherit the permissions of the account the database service runs as.  This means they cannot see the shares that I've manually authenticated to in my current Windows session, and therefore cannot natively write to the share.  Even creating the database directory through EM, where it prompts me for a username and password, does not help - I still cannot use that dir in the data pump.  I know that I could start the DB with an OS user that has permissions to the share, but it's another domain - so that ain't gunna work...  That was a time-consuming and painful little lesson - thanks, Data Pump.
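For the record, pointing Data Pump at a path the Oracle service account can actually reach is straightforward. A minimal sketch, using a hypothetical local path D:\dumps (the directory must already exist on the database server):

CREATE OR REPLACE DIRECTORY dp_local AS 'D:\dumps';
GRANT READ, WRITE ON DIRECTORY dp_local TO system;

Then reference DIRECTORY=DP_LOCAL in the par file instead of DATA_PUMP_DIR.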

The main commands are expdp and impdp.

Quite simply, I run the command as a non-SYSDBA user with a par file:

so expdp system/PassW0rd@e1prod PARFILE=proddta.exp

Nice, and the contents of proddta.exp:

DIRECTORY=DATA_PUMP_DIR
DUMPFILE=PRODDTA.DMP
JOB_NAME=PRODDTATDUMP
SCHEMAS=PRODDTA
LOGFILE=proddtadump.log
EXCLUDE=TABLE:"IN ('F0911', 'F43199')"
# COMPRESS, DIRECT and FEEDBACK are old exp parameters, not Data Pump ones;
# 11g legacy mode will ignore or translate them. If you want dump file
# compression and regular progress output, the native parameters are:
COMPRESSION=METADATA_ONLY
STATUS=60

So that is saying: grab all of the tables in PRODDTA excluding F0911 and F43199, and bang them into dump file PRODDTA.DMP.  From a DB perspective, use a job called PRODDTATDUMP and write the log to LOGFILE in the DIRECTORY.

It's that simple.

Have another file that is essentially the same, except it uses INCLUDE, not EXCLUDE - simple!
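A minimal sketch of that second par file (the dump file and job names here are my assumptions):

DIRECTORY=DATA_PUMP_DIR
DUMPFILE=PRODDTA_BIG.DMP
JOB_NAME=PRODDTABIGDUMP
SCHEMAS=PRODDTA
LOGFILE=proddtabigdump.log
INCLUDE=TABLE:"IN ('F0911', 'F43199')"

The import side is the mirror image: impdp with a par file pointing at the same dump. A hedged example, assuming a target connect string of e1test:

impdp system/PassW0rd@e1test PARFILE=proddta.imp

with proddta.imp being something like:

DIRECTORY=DATA_PUMP_DIR
DUMPFILE=PRODDTA.DMP
SCHEMAS=PRODDTA
LOGFILE=proddtaimp.log
TABLE_EXISTS_ACTION=REPLACE

Note that TABLE_EXISTS_ACTION=REPLACE drops and recreates any table that already exists in the target, so make sure that is what you want.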

Friday 15 March 2013

The day that Oracle flashback saved my life

Well, not really.  It did not actually save my life.  I had backups and I had a backup of the backup – but flashback was pretty good on this 18-million-row table!

I thought I was running “SELECT COUNT(1) from proddta.F43199”, but it seems that I ran “DROP TABLE PRODDTA.F43199”…

Okay, yes...  The message returned from SQL Developer confirms my mistake.

Panic sets in, although I have a distant memory of the concept of flashback.

I perform a quick Google and then courageously execute:

FLASHBACK TABLE proddta.f43199 TO BEFORE DROP;

Then I get confirmation “table PRODDTA.F43199 succeeded” – YES!!!!

I also have my 18 million records.
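For next time: you can check what is sitting in the recycle bin before flashing back. A small sketch (this relies on the recyclebin parameter being on, which it is by default):

SELECT object_name, original_name, droptime
FROM dba_recyclebin
WHERE original_name = 'F43199';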

Thanks Oracle – brilliant

F98811 TC failing for upgrade table conversions

How many table conversions between XE and 9.1?  725!  Wow, that is a large list of tables.

I’ve done enough upgrades to know that the F98811 conversion fails every time.  Easy to fix?  YES!  Just go to BV (batch versions) in the planner environment and find R89811.

Double-click the only version and choose properties.

In the "environment from" and "environment to" fields, make sure that you select the environment that you are upgrading (for me, PD910).

You can then just run it (locally) and it will work.  When you go back to the TC upgrade, you can run it again.  It'll work, or it'll probably tell you the table is already in the target format.  Great!  Continue upgrading the rest of your workflow control tables.

Thursday 14 March 2013

media objects, FTP and annoying names

Getting the e1 web client to use FTP for media objects is fairly easy, unless you think about it too much.  Or perhaps if you read the doco too carefully.  I’m going to help you out here.

I was getting the following in my logs after setting things up correctly (or what I thought was correctly):

14 Mar 2013 15:55:13,738[SEVERE][JDE][BASE]FTPConnection: Unexpected reply 550 The system cannot find the path specified. 
14 Mar 2013 15:55:13,739[SEVERE][JDE][JAS]JDEOWDirect.saveFileFTP(): Invalid FTPConnection exeption is caught Host Name: jdedep2, User Name : anonymous, File path : /E910/MEDIAOBJ/HTMLUpload/FILE-192-168-204-120-43584667690065015-1363236908592.txt
14 Mar 2013 15:55:13,740[SEVERE][JDE][JAS]Exception occured in the JDEOWDirect.mediaObjectSaveFile(): , Source File Name :/u01/app/oracle/Oracle/Middleware/user_projects/domains/mydomain/servers/JDE91DV/stage/JDE91DV/app/webclient.war/moqueue\jdedep2\E910\MEDIAOBJ\HTMLUpload\FILE-192-168-204-120-43584667690065015-1363236908592.txt, Target File Name :\\jdedep2\E910\MEDIAOBJ\HTMLUpload\FILE-192-168-204-120-43584667690065015-1363236908592.txt java.io.IOException: FTP connection failed
14 Mar 2013 15:55:13,740[WARN][JDE][RUNTIME]FTP File upload to the server failed. Please contact your administrator.\nUpload the selected file: [BHP_PBS_ConversionScripts.txt]  \nUploaded file name: [//jdedep2/E910/MEDIAOBJ/HTMLUpload/FILE-192-168-204-120-43584667690065015-1363236908592.txt]


When you create the FTP site for your media objects, it does not point to the E910 dir like the doco suggests; it's the base install dir, as below:

[screenshot: FTP site configuration pointing at the base install directory]

Note also that new installs now use a dir like E910_1, which is a pain.  The share that is created by the install process is called E910, not E910_1, so you cannot share media objects between fat and thin clients unless you create another share called E910_1.
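If you do want fat and thin clients sharing the same media objects, you can create that extra share yourself on the deployment server. A hedged sketch - the install path here is an assumption, so substitute your own:

net share E910_1="E:\JDEdwards\E910_1" /grant:everyone,CHANGE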

So, your P98MOQUE needs to look like:

[screenshot: P98MOQUE media object queue paths using E910_1]

See how I needed to change E910 to E910_1?  This is because the automated FTP is a bit slow: it essentially executes a "cd" command, which needs to go to a dir name, not a share name.
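In other words, the queue path ends up looking something like this (inferred from the log above, so treat it as an illustration):

Before: \\jdedep2\E910\MEDIAOBJ\HTMLUpload
After: \\jdedep2\E910_1\MEDIAOBJ\HTMLUpload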

So, with anonymous access for FTP set up on my deployment server and the above two basic configurations – media objects are working like a peach.
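A quick way to sanity-check the setup is to walk the same path by hand with the command-line FTP client (anonymous login, as configured above):

ftp jdedep2
cd E910_1/MEDIAOBJ/HTMLUpload

If the cd succeeds, the web client should be able to land files there too.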

Remember that we at Myriad have a solution which injects SharePoint into the iframe that holds media objects – no more dumb media object storage; we are using version-controlled SharePoint.

Tuesday 12 March 2013

AS/400 to Oracle (via MS Access) using Oracle SQL Developer. Data conversion

I loaded MS Access with some bespoke tables so that I could copy them to Oracle, but how…

Loading is simple: just import data and choose ODBC. Make sure that you create the ODBC data source with the system naming convention (*SYS) so that Access can see the list of tables.

Use SQL Developer 3+! Do not use the older versions; they do not have the options that you need for granular table selection.

Get it from here

http://www.oracle.com/technetwork/developer-tools/sql-developer/downloads/index.html

Make sure that you have JDK 1.6.0_35, or it ain’t going to work…

Expand the zip contents and run sqldeveloper.exe in the \sqldeveloper\bin dir (der!)

Try a connection from SQL Developer to your Access database

[screenshots: SQL Developer connection dialog for the Access database]

If you get “No read Access to system Tables. Modify Access db before retrying” – you’ll need to change the DB permissions and options.

Now alter the permissions for your database.

I did this in Access 2000

First, open the DB, go to tools->options and ensure system objects is checked

Then go to tools->security->user and group permissions

Choose the users radio button and then give (all) appropriate users "read data" and "read design". Do this for all MSys* tables.

[screenshot: Access user and group permissions dialog showing read data and read design]

See above

Do that for all 5 system tables and then do the same for the groups (this is what I did).

Exit the DB

Then do the connection again via SQL Developer

[screenshot: SQL Developer connected, listing the Access tables]

Terrific!

You see your tables and Access databases listed.

Select your tables in Access, right-click and choose copy.

Make sure you have an Oracle connection defined as the owner of the schema that you want the data in (in my case, PRODDTA).
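Once the copy has run, it is worth a quick sanity check that the row counts match what you saw in Access. A trivial sketch, with a hypothetical bespoke table name:

SELECT COUNT(*) FROM proddta.f5501;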