
Saturday 25 March 2017

Import Method using IMPDP to apply only incremental rows

Step 1: Take a full backup of emp table

[oracle@oracledb ~]$ expdp system/sys123 directory=dpump tables=scott.emp dumpfile=emp.dmp logfile=emp.log

Export: Release 11.2.0.3.0 - Production on Sat Mar 25 12:11:41 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_TABLE_01":  system/******** directory=dpump tables=scott.emp dumpfile=emp.dmp logfile=emp.log
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
. . exported "SCOTT"."EMP"                               8.562 KB      14 rows
Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
  /u01/emp.dmp
Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 12:11:57

SQL> select empno from emp;

     EMPNO
----------
      7369
      7499
      7521
      7566
      7654
      7698
      7782
      7788
      7839
      7844
      7876

     EMPNO
----------
      7900
      7902
      7934

14 rows selected.

Step 2: Now insert one row into EMP table

SQL> insert into emp values(1001,'sirat','MGR','7369',sysdate,1000,null,20);

1 row created.

SQL> commit;

Commit complete.

Step 3: Now take an incremental export using the QUERY and CONTENT=DATA_ONLY parameters in the EXPDP command

[oracle@oracledb ~]$ cat data.par
tables=SCOTT.EMP
directory=dpump
DUMPFILE=emp_incre.dmp
logfile=emp_incre.log
query=SCOTT.EMP:"where empno=1001"
content=DATA_ONLY

[oracle@oracledb ~]$

[oracle@oracledb ~]$ expdp system/passwd parfile=data.par

Export: Release 11.2.0.3.0 - Production on Sat Mar 25 21:32:14 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_TABLE_01":  system/******** parfile=data.par
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
. . exported "SCOTT"."EMP"                               8.031 KB       1 rows
Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
  /u01/emp_incre.dmp
Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 21:32:16

[oracle@oracledb ~]$


Step 4: Delete one row from emp table

SQL> delete from emp where empno=1001;

1 row deleted.

SQL> commit;

Commit complete.

SQL> select empno from emp;

     EMPNO
----------
      7369
      7499
      7521
      7566
      7654
      7698
      7782
      7788
      7839
      7844
      7876

     EMPNO
----------
      7900
      7902
      7934

14 rows selected.

SQL>


Step 5: Apply the incremental backup using the TABLE_EXISTS_ACTION=APPEND option in the IMPDP command

[oracle@oracledb ~]$ impdp system/passwd directory=dpump dumpfile=emp_incre.dmp logfile=imp1.log table_exists_action=append

Import: Release 11.2.0.3.0 - Production on Sat Mar 25 21:36:31 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_FULL_01":  system/******** directory=dpump dumpfile=emp_incre.dmp logfile=imp1.log table_exists_action=append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."EMP"                               8.031 KB       1 rows
Job "SYSTEM"."SYS_IMPORT_FULL_01" successfully completed at 21:36:34

[oracle@oracledb ~]$

Now check whether the data was appended to the emp table:

SQL> select empno from emp;

     EMPNO
----------
      1001
      7369
      7499
      7521
      7566
      7654
      7698
      7782
      7788
      7839
      7844

     EMPNO
----------
      7876
      7900
      7902
      7934

15 rows selected.

SQL>


That's it!

Reference :  http://www.acehints.com/2012/05/datapump-impdp-tableexistsaction-append.html


Tuesday 25 August 2015

EXPDP in Oracle 12c (ORA-39002 ORA-39070 ORA-39087)

DROP DIRECTORY DUMP_DATA;

CREATE OR REPLACE DIRECTORY
DUMP_DATA AS
'G:\InHouse_DB_Backup';



GRANT EXECUTE, READ, WRITE ON DIRECTORY DUMP_DATA TO system WITH GRANT OPTION;

impdp system/sys123 schemas=ORBHRM,ORBWEB,UTILITY directory=DUMP_DATA dumpfile="HRDBRQDB-20150824.dmp" logfile=test.log

C:\Users\Administrator>impdp system/sys123 schemas=ORBHRM,ORBWEB,UTILITY directory=DUMP_DATA dumpfile="HRDBRQDB-20150824.dmp" logfile=test.log

Import: Release 12.1.0.2.0 - Production on Tue Aug 25 14:51:00 2015

Copyright (c) 1982, 2014, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-39087: directory name DUMP_DATA is invalid

Note: In Oracle 12c, expdp and impdp must connect to the pluggable database (PDB) by its service name; otherwise the directory object cannot be resolved:

impdp system/sys123@orcl_pdb schemas=ORBHRM,ORBWEB,UTILITY directory=DUMP_DATA dumpfile="HRDBRQDB-20150824.dmp" logfile=test.log
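The @orcl_pdb connect identifier above has to resolve to the PDB's service. A minimal tnsnames.ora entry sketch, assuming the service name is orcl_pdb and a local listener on port 1521:

```
ORCL_PDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = localhost)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = orcl_pdb))
  )
```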

Sunday 17 May 2015

ORA-31693 ORA-02354 ORA-01555: snapshot too old: rollback segment number 12 with name "_SYSSMU12_933907484$" too small in expdp

Problem: Processing object type DATABASE_EXPORT/AUDIT
ORA-31693: Table data object "MICR"."OUTWDCLR" failed to load/unload and is being skipped due to error:ORA-02354: error in exporting/importing data
ORA-01555: snapshot too old: rollback segment number 12 with name "_SYSSMU12_933907484$" too small

select COLUMN_NAME,SECUREFILE,PCTVERSION,RETENTION from dba_lobs where OWNER='MICR' and TABLE_NAME='OUTWDCLR';

COLUMN_NAME  SECUREFILE PCTVERSION  RETENTION
------------ ---------- ---------- ----------
IMAGE_FRONT  NO                 10
IMAGE_REAR   NO                 10


SQL> show parameter undo;

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
undo_management                      string      AUTO
undo_retention                       integer     900
undo_tablespace                      string      UNDOTBS1
SQL>


SQL> select max(maxquerylen) from v$undostat;

MAX(MAXQUERYLEN)
----------------
            3291

SQL>

The retention comes back as 900 seconds (15 minutes), which matches the current UNDO_RETENTION setting,
but the maximum query length is 3291 seconds.


When the LOB was created, the actual setting for RETENTION was defined by the current setting for UNDO_RETENTION.
This time is not long enough.

Solution:

 1. Modify the current UNDO_RETENTION for the database:

SQL>ALTER SYSTEM SET UNDO_RETENTION = 4500 scope=both sid='*';

2. Modify the LOB retention to become greater than the undersized retention parameter, following the steps from MOS Note 563470.1

SQL> alter table MICR.OUTWDCLR modify lob(IMAGE_FRONT) (retention);
Table altered.

SQL> alter table MICR.OUTWDCLR modify lob(IMAGE_REAR) (retention);
Table altered.


3. Query the lob retention again to verify that the change has taken hold:
SQL> select COLUMN_NAME,SECUREFILE,PCTVERSION,RETENTION from dba_lobs where OWNER=upper('&OWNER') and TABLE_NAME=upper('&TABLE_NAME') ;

COLUMN_NAME                    SEC PCTVERSION  RETENTION
------------------------------ --- ---------- ----------
IMAGE_FRONT                    NO                   4500
IMAGE_REAR                     NO                   4500

4. Perform the export again.

Saturday 15 November 2014

Import (IMPDP) Oracle Database backup from network location


Scenario:

Suppose we want to restore a backup into the destination database server (192.168.10.6), while the backup resides on the source database server (192.168.10.32)

Solution:

Step1: Keep the backup dump file on the source database server (192.168.10.32)

Step2: Create a database link on the source database server (192.168.10.32)

CREATE PUBLIC DATABASE LINK DPUMP
 CONNECT TO SYSTEM
 IDENTIFIED BY
 USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=192.168.10.6)(PORT=1521))(CONNECT_DATA=(SID=orcl)))';
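Before running the import it is worth confirming that the link resolves; any simple remote query will do (a sketch):

```
SQL> select sysdate from dual@DPUMP;
```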

Step3: Now create a directory object where the database backup will reside

DROP DIRECTORY data_dump;
CREATE OR REPLACE DIRECTORY data_dump AS '/back/exp/';

GRANT EXECUTE, READ, WRITE ON DIRECTORY SYS.data_dump TO SYSTEM WITH GRANT OPTION;

Step4: Now execute the import (impdp) command from the source database server (192.168.10.32) to restore the
       backup into the destination database server (192.168.10.6)

Step5: Set the SCHEMAS parameter to the schemas you want to import

impdp system/sys123 schemas=scott,hr,test network_link=DPUMP directory=data_dump dumpfile=orcl_full.dmp logfile=orcl_full.log    


  
Step6: Verify that the data you wanted to import is present.

Cheers.....  

Wednesday 12 November 2014

How to Prevent ORA-39000 ORA-31640 ORA-27037 Errors When Performing DataPump Export/Import

APPLIES TO:

Oracle Database - Enterprise Edition - Version 10.2.0.1 and later Information in this document applies to any platform.

GOAL:

This article documents a resolution for errors ORA-39000, ORA-31640 and ORA-27037 when performing DataPump export/import. DataPump Import can fail with the following errors:

Import: Release 10.2.0.1.0 - Production on Friday, 30 January, 2009 15:10:33
Copyright (c) 2003, 2005, Oracle. All rights reserved.
;;;
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
ORA-39000: bad dump file specification
ORA-31640: unable to open dump file "/oracle/u01/app/oracle/oracle/product/10.2.0/db_2/admin
/dpdump/expdat.dmp" for read
ORA-27037: unable to obtain file status
Linux Error: 2: No such file or directory
Additional information: 3

SOLUTION:

The directory object named by the DIRECTORY parameter (the location used by the DataPump Export or Import) was not created properly or has permission issues.
Drop and re-create the directory object, then change the expdp/impdp command to point to the new directory to resolve this issue. You must have the DBA privilege (or CREATE ANY DIRECTORY) to create a directory.
For example, to create a directory object named expdp_dir located at /u01/backup/exports, enter the following SQL statements:

SQL> drop directory expdp_dir;
SQL> create directory expdp_dir as '/u01/backup/exports';
Then grant read and write permissions to the users who will be performing the data pump export and import.
SQL> grant read, write on directory expdp_dir to system, user1, user2, user3;
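To confirm the directory object exists and points at the intended OS path, the DBA_DIRECTORIES dictionary view can be queried (a sketch using the directory name from the example):

```
SQL> select directory_name, directory_path from dba_directories where directory_name = 'EXPDP_DIR';
```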

REFERENCES:  Doc ID 784566.1

DataPump Import (IMPDP) Fails With Errors ORA-39001 ORA-39000 ORA-31640 ORA-27037

APPLIES TO:

Oracle Database - Enterprise Edition - Version 10.1.0.2 to 11.2.0.4 [Release 10.1 to 11.2]
Information in this document applies to any platform.


ERROR SYMPTOMS:

DataPump import fails with the following errors:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31640: unable to open dump file "/u01/bkups/exports/EXPORT.dmp" for read
ORA-27037: unable to obtain file status
Linux-x86_64 Error: 2: No such file or directory
Additional information: 3

The parameters used are:

userid=system/
DIRECTORY=my_dir
DUMPFILE=EXPORT.dmp
LOGFILE=my_logdir:EXPORT.log

CHANGES:

DataPump export with parameters:
userid=username/
DIRECTORY=my_dir
DUMPFILE=EXPORT.DMP
LOGFILE=EXPORT.log
content=metadata_only
VERSION=10.2.0

was successful:
Master table "EXPORT"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for EXPORT.SYS_EXPORT_SCHEMA_01 is:
/spare/clone/EXPORT.DMP
Job "EXPORT"."SYS_EXPORT_SCHEMA_01" successfully completed at 20:41:12

CAUSE:

At first glance, this appears to be an exact match for Note 784566.1 "How to Prevent ORA-39000 ORA-31640 ORA-27037 Errors When
Performing Data Pump Export/Import".

If you have already read that note, tried the solution and are still getting the errors, then the problem may be with the actual
export.dmp file.

In this case, as you can see from the export and import parameter files, the name of the export dump file is not the same:

Export: DUMPFILE=EXPORT.DMP
Import: DUMPFILE=EXPORT.dmp

SOLUTION:

Once the import parameter was changed to DUMPFILE=EXPORT.DMP, the import completed successfully.
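The underlying cause is that Linux file names are case-sensitive, so EXPORT.DMP and EXPORT.dmp are two different files. A tiny sketch (hypothetical file in /tmp) illustrating this:

```shell
# Linux file names are case-sensitive; a file created with an
# upper-case extension is not found under the lower-case name.
cd /tmp
touch EXPORT.DMP
[ -e EXPORT.DMP ] && echo "EXPORT.DMP found"
[ -e EXPORT.dmp ] || echo "EXPORT.dmp not found"
```

This is why the import has to spell the dump file name exactly as the export wrote it.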


Reference : Doc ID 1228194.1

Sunday 7 September 2014

Export/Import specific tablespace using Data Pump in Oracle 10g Database


1.  Tablespace Export

expdp system/ TABLESPACES=USERS,UNDOTBS1 directory=DATA_PUMP_DIR dumpfile=test.dmp LOGFILE=exp.log    parallel=2  

In the above example, expdp takes a backup of the contents of USERS and UNDOTBS1. The export also runs quickly because of the parallel=2 option (provided multiple CPUs are available on the database server).
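One caveat, hedged: with PARALLEL=2 and a single DUMPFILE name, the parallel workers share one file and the speed-up is limited. The %U substitution variable gives each worker its own dump file (file names below are illustrative):

```
expdp system/ TABLESPACES=USERS,UNDOTBS1 directory=DATA_PUMP_DIR dumpfile=test_%U.dmp logfile=exp.log parallel=2
```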

2. Transportable tablespace

expdp system/ transport_tablespaces=test_user_tbs transport_full_check=y directory=DATA_PUMP_DIR dumpfile=test.dmp logfile=exp.log

Transportable tablespace export and import works across platforms, and only metadata is exported. With cross-platform transportable tablespaces, data movement is simpler and faster.

This mode requires that you have the EXP_FULL_DATABASE role.

Please note that

1. Source and target database must use the same character set/national character set.
2. You cannot transport a tablespace to a target database in which it already exists.
3. Transportable tablespace exports cannot be restarted once stopped.
4. The target database must be at the same or a higher release level than the source database.

Transportable tablespace export and import on same endian platforms 
Step 1: Find the operating system byte order (endianness) on the source and target databases
SQL> select * from v$transportable_platform order by platform_id;

PLATFORM_ID PLATFORM_NAME                      ENDIAN_FORMAT
----------- ---------------------------------- -------------
          1 Solaris[tm] OE (32-bit)            Big
          2 Solaris[tm] OE (64-bit)            Big
          3 HP-UX (64-bit)                     Big
          4 HP-UX IA (64-bit)                  Big
          5 HP Tru64 UNIX                      Little
          6 AIX-Based Systems (64-bit)         Big
          7 Microsoft Windows IA (32-bit)      Little
          8 Microsoft Windows IA (64-bit)      Little
          9 IBM zSeries Based Linux            Big
         10 Linux IA (32-bit)                  Little
         11 Linux IA (64-bit)                  Little
         12 Microsoft Windows 64-bit for AMD   Little
         13 Linux 64-bit for AMD               Little
         15 HP Open VMS                        Little
         16 Apple Mac OS                       Big
         17 Solaris Operating System (x86)     Little
         18 IBM Power Based Linux              Big


3. Tablespace Import

impdp system/ TABLESPACES=USERS directory=DATA_PUMP_DIR dumpfile=test.dmp LOGFILE=imp.log

The above example imports all tables that have data in tablespace USERS; it assumes the tablespace already exists.

Tuesday 24 June 2014

ORA-39068 ORA-01950: no privileges on tablespace ORA-39097: Data Pump job encountered unexpected error -1950

Error:

ORA-39006: internal error
ORA-39068: invalid master table data in row with PROCESS_ORDER=-3
ORA-01950: no privileges on tablespace 'FCAT_NRB'
ORA-39097: Data Pump job encountered unexpected error -1950

Reason: The user executing the Data Pump job has no quota on the "FCAT_NRB" tablespace

Solution: Grant the following quota and run the Data Pump job again.

alter user NRBAPP_NRB quota unlimited on FCAT_NRB;
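To verify the quota afterwards, the DBA_TS_QUOTAS dictionary view can be queried (a sketch; a MAX_BYTES of -1 means an unlimited quota):

```
SQL> select tablespace_name, username, max_bytes from dba_ts_quotas where username = 'NRBAPP_NRB';
```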

Please try again... and buzz me if you need anything.

Tuesday 10 June 2014

Export/Import Specific Table in Oracle

Step1: Export a specific table or tables using the following command

exp /password@db_name tables=  file=D:\cnsstnt.dmp  statistics=NONE log=D:\cnsstnt.log

Step2: Import a specific table or tables using the following command

imp /password@db_name fromuser= touser= tables=  file=D:\cnsstnt.dmp  statistics=NONE ignore=Y log=D:\cnsstnt.log

Cheers....

Monday 26 May 2014

Export Oracle Database to Network Location using Data Pump

Step1: Create a public database link pointing to the database that will be backed up

DROP PUBLIC DATABASE LINK DBDUMP;

CREATE PUBLIC DATABASE LINK DBDUMP
 CONNECT TO SYSTEM
 IDENTIFIED BY
 USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=192.168.56.2)(PORT=1525))(CONNECT_DATA=(SID=ORCL)))';

Note: 192.168.56.2 is the Database server which will be backed up

Step2: Create a directory object for the backup files and grant the proper permissions to the user who will take the backup

DROP DIRECTORY DATAPUMP_CBS;

CREATE OR REPLACE DIRECTORY 
DATAPUMP_CBS AS 
'/u01/ORCL_backup/';

GRANT EXECUTE, READ, WRITE ON DIRECTORY SYS.DATAPUMP_CBS TO ORBHRM WITH GRANT OPTION;

Step3: Export backup command 

expdp /passwd full=y  network_link=DBDUMP directory=DATAPUMP_CBS dumpfile=ORCLFULLDB-$(date +%Y%m%d) logfile=ORCLFULLDBLOG-$(date +%Y%m%d) EXCLUDE=STATISTICS CONTENT=ALL
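The $(date +%Y%m%d) substitution in the command above expands to the current date, so each day's run gets its own dump and log file names. A quick sketch of the expansion (file name illustrative):

```shell
# date +%Y%m%d prints the current date as YYYYMMDD, e.g. 20140526,
# which the shell then embeds in the dump file name before expdp runs.
fname="ORCLFULLDB-$(date +%Y%m%d).dmp"
echo "$fname"
```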

Monday 21 April 2014

Import Oracle 11gR2 backup (specific users) to Oracle 10gR2

1. Export full database backup

exp /@SID full=y file=/home/oracle/stelar.dmp STATISTICS=NONE

2. Drop the specific users that need to be imported

DROP USER ATMUTL CASCADE;
DROP USER BACH CASCADE;
DROP USER BACHINT CASCADE;
DROP USER BNSUSER CASCADE;
DROP USER FIUINT CASCADE;
DROP USER ISLBAS CASCADE;
DROP USER ISLIMG CASCADE;
DROP USER ISLITS CASCADE;
DROP USER ISLSYS CASCADE;
DROP USER MYBANK CASCADE;
DROP USER ORBBBR CASCADE;
DROP USER ORBITS CASCADE;
DROP USER SBLMIS CASCADE;
DROP USER SMSGTWAY CASCADE;
DROP USER STFOREX CASCADE;
DROP USER STLBAS CASCADE;
DROP USER STLIMG CASCADE;
DROP USER STLSYS CASCADE;
DROP USER WEBADMIN CASCADE;

3. Extract (unrar) the backup rar file

unrar e Stelar_31102013.rar

4. Import specific database users

imp user/passwd file=/back/export_backup/Stelar_Thu.DMP fromuser=MYBANK,STLSYS,ORBITS,BACH,BACHINT,ORBBBR,FIUINT,ISLBAS,SMSGTWAY,ISLITS,ISLIMG,WEBADMIN,BNSUSER,STLBAS,ATMUTL,STLIMG,STFOREX,ISLSYS,SBLMIS touser=MYBANK,STLSYS,ORBITS,BACH,BACHINT,ORBBBR,FIUINT,ISLBAS,SMSGTWAY,ISLITS,ISLIMG,WEBADMIN,BNSUSER,STLBAS,ATMUTL,STLIMG,STFOREX,ISLSYS,SBLMIS

Monday 4 November 2013

Export/Import Specific Schema in Oracle

Note: Backup taken in Oracle 10g

exp system/@STLBAS owner=FIUINT file=/home/oracle/fiuint.dmp STATISTICS=NONE

Note: Backup restore in Oracle 11g

imp system/ file=/home/oracle/fiuint.dmp fromuser=FIUINT touser=FIUINT

Wednesday 18 September 2013

Export/import backup using batch file in Windows

1. Create a batch file named exp_schema.bat for a specific schema

exp stlbas/stlbas@STLBAS_32 owner=RASHID file=D:\cnsstnt.dmp STATISTICS=NONE

2. Create a batch file named exp_table.bat for a specific table

exp stlbas/stlbas@STLBAS_32 tables=(ACFGENFD) file=D:\cnsstnt.dmp STATISTICS=NONE

Cheers....

Import Dump from Windows to Linux



1. Download and install the following rpm

[root@stnprod2 rpm]# rpm -Uvh unrar-3.8.2-1.el3.rf.x86_64.rpm
warning: unrar-3.8.2-1.el3.rf.x86_64.rpm: Header V3 DSA signature: NOKEY, key ID 6b8d79e6
error: failed to stat /mnt/test: Input/output error
Preparing…                ########################################### [100%]
1:unrar                  ########################################### [100%]

2. Send the dump file to Linux Server.

3. Unrar the dump file using the following command

[root@stnprod2 stelar_back]# unrar e  Stelar_23072013.part01.rar
UNRAR 3.80 beta 2 freeware      Copyright (c) 1993-2008 Alexander Roshal
Extracting from Stelar_23072013.part01.rar
Extracting  Stelar_Tue.DMP                                            23%
Extracting from Stelar_23072013.part02.rar
…         Stelar_Tue.DMP                                            47%
Extracting from Stelar_23072013.part03.rar
…         Stelar_Tue.DMP                                            71%
Extracting from Stelar_23072013.part04.rar
…         Stelar_Tue.DMP                                            95%
Extracting from Stelar_23072013.part05.rar
…         Stelar_Tue.DMP                                            OK
All OK
[root@stnprod2 stelar_back]# ll
total 70739316
-r-sr-sr-t 1 root root  4414504960 Jul 24 23:49 Stelar_23072013.part01.rar
-r-sr-sr-t 1 root root  4414504960 Jul 25 00:02 Stelar_23072013.part02.rar
-r-sr-sr-t 1 root root  4414504960 Jul 25 00:12 Stelar_23072013.part03.rar
-r-sr-sr-t 1 root root  4414504960 Jul 25 17:22 Stelar_23072013.part04.rar
-r-sr-sr-t 1 root root   922800911 Jul 25 17:25 Stelar_23072013.part05.rar
-rw-r--r-- 1 root root 53785440256 Jul 23 04:28 Stelar_Tue.DMP
[root@stnprod2 stelar_back]#

4. Create a batch file and script to execute from Windows

--script.bat
:: Open a Telnet window
start telnet.exe 192.168.10.9
:: Run the script
cscript F:\batch\SendKeys.vbs

--SendKeys.vbs
Set OBJECT = WScript.CreateObject("WScript.Shell")
WScript.sleep 50
OBJECT.SendKeys "oracle{ENTER}"
WScript.sleep 100
OBJECT.SendKeys "oracle{ENTER}"
WScript.sleep 50
OBJECT.SendKeys "cd /back/stelar_back{ENTER}"
WScript.sleep 50
OBJECT.SendKeys "imp system/sys123  file=/back/stelar_back/Stelar_Tue.DMP full=Y{ENTER}"
WScript.sleep 50
OBJECT.SendKeys " "

5. Execute the batch file