Channel: Teradata Downloads - Tools

Re: Delete Syntax in MLOAD


Hi,
I need to delete the rows from a table in an MLOAD script that have no matching record in the input file.
I tried a couple of approaches but couldn't succeed.

DELETE FROM Employee WHERE EmpNo <> :EmpNo and EmpName <> :Empname;

UTY0805 RDBMS failure, 3537: A MultiLoad DELETE Statement is Invalid.

DELETE FROM Employee WHERE (EmpNo,EmpName) NOT IN (:EmpNo,:Empname);

UTY0805 RDBMS failure, 3707: Syntax error, expected something like
     a 'SELECT' keyword or '(' or a 'NONTEMPORAL' keyword or 'AS' keyword or '('
     between '(' and ':'
Please help
Thanks
Balu
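
For reference, the failures above are consistent with MultiLoad DELETE task restrictions: the task takes a single, self-contained DELETE, and :host variables from an IMPORT cannot drive a NOT IN. A common workaround is to load the file into a staging table first and run the delete as ordinary SQL. A minimal BTEQ sketch, where Emp_Stage is a hypothetical staging table already loaded from the file (e.g. via FastLoad):

.LOGON tdpid/user,password;

/* NOT EXISTS rather than NOT IN, to stay safe if EmpNo/EmpName
   can be NULL in either table. */
DELETE FROM Employee
WHERE NOT EXISTS
      (SELECT 1 FROM Emp_Stage s
       WHERE s.EmpNo = Employee.EmpNo
         AND s.EmpName = Employee.EmpName);

.LOGOFF;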


Connect to Microsoft SQL Server from Teradata Studio 14.10


Is it possible to use the new Teradata Studio 14.10 to also connect to a Microsoft SQL Server?
The "New Connection Profile" wizard lists "SQL Server". Selecting it brings up a "Specify a Driver and Connection Details" page, but no drivers are installed. Pressing "New Driver Definition" brings up a "Specify a Driver Template and Definition Name" page listing Microsoft SQL Server 2000, 2005 and 2008 templates, but selecting any of them just gives the message "Unable to locate JAR/zip in file system as specified by the driver definition: sqljdbc.jar".
Any help on how to connect to SQL Server from Teradata Studio, or any description of how to do this?
Peter Schwennesen 


Starting TPUMP on Windows 2008 R2 fails


Hi,
I have installed the Teradata utilities 14.10 on Windows 2008 R2 (64-bit). When I try to start TPUMP I get the error message: "The application was unable to start correctly (0x000007b). Click OK to close the application."
Does this mean that the utilities do not run on this platform?

thanks


Error with Teradata connector for Hadoop with HCAT -> TD fastload


I have an HCatalog table:

CREATE TABLE src.t (
   msgtype string
   , ts string
   , source string
   , msgclass int
   , msgtext string
) PARTITIONED BY (device_range string, device_id string);

and a TD table:

CREATE SET TABLE tgt.t ,FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      msgtype VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      ts VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      source VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      msgclass INTEGER,
      msgtext VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      device_range VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      device_id VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC)
PRIMARY INDEX ( ts );

After exporting the following environment variables and HADOOP_CLASSPATH:

export HADOOP_HOME=/usr/lib/hadoop
export HIVE_HOME=/usr/lib/hive
export HCAT_HOME=/usr/lib/hcatalog
export TDCH_HOME=/usr/lib/tdch

export HADOOP_CLASSPATH=$HIVE_HOME/conf:\
${HIVE_HOME}/lib/antlr-runtime-3.4.jar:\
${HIVE_HOME}/lib/commons-dbcp-1.4.jar:\
${HIVE_HOME}/lib/commons-pool-1.5.4.jar:\
${HIVE_HOME}/lib/datanucleus-core-3.0.9.jar:\
${HIVE_HOME}/lib/datanucleus-enhancer-3.0.1.jar:\
${HIVE_HOME}/lib/datanucleus-rdbms-3.0.8.jar:\
${HIVE_HOME}/lib/hive-cli-0.11.0.1.3.2.0-111.jar:\
${HIVE_HOME}/lib/hive-exec-0.11.0.1.3.2.0-111.jar:\
${HIVE_HOME}/lib/hive-metastore-0.11.0.1.3.2.0-111.jar:\
${HIVE_HOME}/lib/jdo2-api-2.3-ec.jar:\
${HIVE_HOME}/lib/libfb303-0.9.0.jar:\
${HIVE_HOME}/lib/libthrift-0.9.0.jar:\
${HIVE_HOME}/lib/mysql-connector-java.jar:\
${HIVE_HOME}/lib/slf4j-api-1.6.1.jar:\
${HCAT_HOME}/usr/lib/hcatalog/share/hcatalog/hcatalog-core-0.11.0.1.3.2.0-111.jar:\
${TDCH_HOME}/hive-builtins-0.9.0.jar

and using the Teradata Connector for Hadoop command:

hadoop jar /usr/lib/tdch/teradata-connector-1.2.jar \
com.teradata.hadoop.tool.TeradataExportTool \
-libjars /usr/lib/hive/lib/hive-cli-0.11.0.1.3.2.0-111.jar,/usr/lib/hive/lib/hive-exec-0.11.0.1.3.2.0-111.jar,/usr/lib/hive/lib/hive-metastore-0.11.0.1.3.2.0-111.jar,/usr/lib/hive/lib/jdo2-api-2.3-ec.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/libthrift-0.9.0.jar,/usr/lib/hive/lib/slf4j-api-1.6.1.jar,/usr/lib/tdch/hive-builtins-0.9.0.jar \
-classname com.teradata.jdbc.TeraDriver \
-url jdbc:teradata://td_server/DATABASE=tgt \
-username myuser \
-password mypasswd \
-jobtype hcat \
-method multiple.fastload \
-sourcedatabase src \
-sourcetable t \
-targettable t

I get the following error:

ERROR tool.TeradataExportTool: java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/HCatInputFormat

I have been playing around with the following arguments, but no combination has helped so far:

-targettableschema "msgtype VARCHAR(255),ts VARCHAR(255),source VARCHAR(255),msgclass INT,msgtext VARCHAR(255),device_range VARCHAR(255),device_id VARCHAR(255)" 
-targetfieldnames "msgtype,ts,source,msgclass,msgtext,device_range,device_id"
-targetfieldcount "7" 
-sourcetableschema "msgtype STRING,ts STRING,source STRING,msgclass INT,msgtext STRING,device_range STRING,device_id STRING"
-sourcefieldnames "msgtype,ts,source,msgclass,msgtext,device_range,device_id"
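
One thing worth checking: since HCAT_HOME is /usr/lib/hcatalog, the hcatalog-core entry above expands to the doubled path /usr/lib/hcatalog/usr/lib/hcatalog/share/..., so hcatalog-core (which provides HCatInputFormat) never actually lands on the classpath. A sketch of a corrected entry, with the same jar also added to -libjars so it ships with the job:

# HCAT_HOME already points at /usr/lib/hcatalog:
export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:\
${HCAT_HOME}/share/hcatalog/hcatalog-core-0.11.0.1.3.2.0-111.jar

# and append the same jar to the -libjars list:
#   ...,/usr/lib/hcatalog/share/hcatalog/hcatalog-core-0.11.0.1.3.2.0-111.jar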

 

 


TPT 14.10 error with fifo input files


Hi everyone!
I have several TPT scripts reading from a FIFO input file that used to run properly under TPT 13.0. The system I work on is being upgraded to 14.10 and the scripts now fail.
The error is:
        "TPT19120 !ERROR! multi-instance read feature requires REGULAR file names."
I also tried the TPT sample scripts found in /opt/teradata/client/14.10/tbuild/sample/userguide/02b (your installation path may differ), which show how to run TPT with a FIFO input file, but I get the same error:

"TPT19120 !ERROR! multi-instance read feature requires REGULAR file names. 'data/datapipe' is not a REGULAR file."

 

Does anyone know what is happening? Is there a parameter I have to define in order to read input data from a FIFO file? I can't find anything in the documentation related to this error, and the same scripts run fine under 13.0.

Thanks in advance for your help!!!
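
For what it's worth, TPT19120 concerns the multi-instance read feature: a FIFO cannot be split across several reader instances, so the DataConnector producer has to run as exactly one instance. A sketch of an APPLY step pinned to single instances (operator and table names here are placeholders, not from your script):

APPLY ('INSERT INTO mytable VALUES (:col1, :col2);')
TO OPERATOR ( LOAD_OPERATOR[1] )
SELECT * FROM OPERATOR ( FILE_READER[1] );  /* [1] = one reader instance */

If the script already runs a single instance, it may be worth checking whether the MultipleReaders attribute is set on the reader, since that also implies a multi-instance read.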
 


BTEQ exporting multiple files based on column value in a table


Hi,
Can anyone please give me some thoughts on this?
I have a scenario where I need to create multiple extract files based on a column's values, using a BTEQ script in a Unix environment.
Example: table abc

C_Name   ID
xxxxx    1
yyyy     1
aaaaa    2
bbbbb    2
ccccc    1

Now I need to create the files based on ID, and the outfile name should be Name_ID.txt (e.g. Name_1.txt).
Name_1.txt should contain only the ID 1 rows.
There are many more columns in the extract, but for this example I am using two columns.

thanks

Krish
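
One possible approach, since BTEQ has no loop construct of its own, is to drive it from the shell with one export per distinct ID. A minimal sketch, with the logon string and table name as placeholders:

#!/bin/sh
# Pass 1: collect the distinct ID values (numeric-only output lines).
ids=$(bteq <<EOF 2>/dev/null | grep -E '^ *[0-9]+ *$'
.logon tdpid/user,password;
.set titledashes off;
select distinct ID from abc;
.logoff;
EOF
)
# Pass 2: one export file per ID value.
for id in $ids
do
bteq <<EOF
.logon tdpid/user,password;
.set titledashes off;
.export report file=Name_${id}.txt;
select C_Name from abc where ID = ${id};
.export reset;
.logoff;
EOF
done

The column heading line still lands in each export file; trim it afterwards (or switch to .export data) if the target format cannot tolerate it.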


FASTLOAD SCRIPT ISSUE


I have an issue with my FastLoad script. It loads the data into my target table, but characters like '*' in my fixed-width flat file need to be converted to NULL.
My client's requirement is that runs of '********' in the fixed-width file be treated as NULL while loading the table. Could anyone help with this?
Thanks .Appreciate your answers.
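
If I remember the FastLoad DEFINE syntax correctly, a NULLIF clause on each field definition handles exactly this case. A sketch with hypothetical field names and widths:

DEFINE
   emp_no   (CHAR(8),  NULLIF = '********'),
   emp_name (CHAR(10), NULLIF = '**********')
FILE = datafile.txt;

Any field whose entire fixed-width value matches the asterisk string is then loaded as NULL; repeat the pattern to each field's own length.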


Handle records in Mload ET table in TEXT mode - A column or character expression is larger than the max size


Greetings experts,
I have FastExported a table in TEXT mode and loaded the data into the target table using MultiLoad in TEXT mode (I am using the 13.00 demo version on Windows 7).
Source/target table structure:

      L_ORDERKEY INTEGER,
      L_PARTKEY INTEGER,
      L_QUANTITY DECIMAL(15,2),
      L_LINESTATUS CHAR(1) CHARACTER SET LATIN NOT CASESPECIFIC
PRIMARY INDEX ( L_ORDERKEY )

Now I have manually edited some records in the FastExported file to cause numeric overflow, so some records end up in the ET table. I was trying to handle those records.
Following is the BTEQ script I used to export the records in REPORT mode from the ET table, which fails:

.logon localtd/tduser,tduser;
.set format on;

.export report file="G:\Users\cheeli\Desktop\bteq_op\et_itemppi_text.txt";

select hostdata from samples.et_itemppi_wodate;

.export reset;

 
Error message is:

select hostdata from samples.et_itemppi_wodate;
 *** Failure 3798 A column or character expression is larger than the max size.
                Statement# 1, Info =0
 *** Total elapsed time was 1 second.

However, when I tried the same with FastExport it worked, and I successfully loaded the records into the target table.

FastExport script:

.logtable LT_itemppi_;
.logon localtd/tduser,tduser;
.begin export sessions 12;
.export outfile "G:\Users\cheeli\Desktop\fexp_out\et_fexp_itemppi_text.txt" format text mode record;
select hostdata from samples.et_itemppi_wodate;
.end export;
 

Mload script:

.LOGTABLE SAMPLES.ML_ITEMPPI_wodate;
.logon localtd/tduser,tduser;

.begin import mload tables samples.itemppi_wodate
checkpoint 70
errlimit 3;

.LAYOUT DATA_LAYOUT;
.filler abc * char(2);
.field L_ORDERKEY * char(12);
.filler l_partkey_filler * char(7);
.field L_PARTKEY * char(5); 
.field L_QUANTITY * char(20); 
.field L_LINESTATUS * CHAR(2); 

.dml label insert_itemppi;
insert into samples.itemppi_wodate values (:L_ORDERKEY, :L_PARTKEY, :L_QUANTITY, :L_LINESTATUS);

.import infile "G:\Users\cheeli\Desktop\fexp_out\et_fexp_itemppi_text.txt" 
format text
layout data_layout 
apply insert_itemppi;
.end mload;

.logoff;

Can you please let me know how to export these records from BTEQ?
I have tried to cast the hostdata to char(1000) and it has failed as cast is not allowed on VARBYTE.
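
A guess at the cause: in REPORT mode BTEQ must render HostData, a large VARBYTE, as a formatted character column, which trips the 3798 size limit; FastExport in RECORD mode writes the raw bytes, which is why it succeeds. The BTEQ equivalent is DATA mode, which also writes raw records. A sketch:

.logon localtd/tduser,tduser;
.export data file="G:\Users\cheeli\Desktop\bteq_op\et_itemppi_text.dat";
select hostdata from samples.et_itemppi_wodate;
.export reset;
.logoff;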

 


Teradata Wallet


Hello!
I'm trying to use Teradata Wallet to encrypt the user password in some scripts.
I have inserted an item named PWD_SSF_DEV and then created the script deptquery.txt.
This is the source code:
.logon 11.51.71.141/US_FND_SSF_DEV,$tdwallet(PWD_SSF_DEV)
.logoff
.exit

If I run bteq < deptquery.txt, it works correctly, but I would like to use the wallet in a bash script like this:
 

#!/bin/bash
[...]
bteq  <<EOF
.logon 11.51.71.141/US_FND_SSF_DEV,"$tdwallet(PWD_SSF_DEV)";
[...]
.quit
EOF

 

(where US_FND_SSF_DEV is the user whose password is defined in the wallet item)

Unfortunately, this script returns this error:

.logon 11.51.71.141/US_FND_SSF_DEV,
*** Error: Logon failed!
*** Total elapsed time was 1 second.

Thank you for helping!
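
A likely cause: inside an unquoted heredoc, bash expands $tdwallet as an (undefined, hence empty) shell variable before bteq ever sees the line, which matches the echoed .logon line ending right at the comma. Quoting the heredoc delimiter, or escaping the dollar sign as \$tdwallet, passes the token through literally. A sketch:

#!/bin/bash
# 'EOF' quoted: no expansion inside the heredoc, so bteq receives
# the $tdwallet(PWD_SSF_DEV) token intact.
bteq <<'EOF'
.logon 11.51.71.141/US_FND_SSF_DEV,$tdwallet(PWD_SSF_DEV);
.quit
EOF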


BTEQ in Windows Environment


Hello all. First-time poster.
What components (i.e. software, drivers) would I need to run BTEQ scripts in a Windows environment?
 
Your feedback is greatly appreciated.


Teradata Parallel Transporter - Load error

$
0
0

Error occurred during Initiate:

loadop.dll) instance(1): INITIATE method failed with status = Not Found
Type: 0
Driver Terminated with status 3
Deleting objects
*** Load Complete ***

The above error occurs when we execute a load operation using Teradata Parallel Transporter.

 

 

We use JNI to connect from C++; below is the source used to execute the load operation:

 

TD_PT_OPERATOR='TD_LOAD'\n"+
"TARGET.ATTRIBUTES.UserName = 'dbc'\n"+
"TARGET.ATTRIBUTES.UserPassword = 'dbc'\n"+
"TARGET.ATTRIBUTES.LogTable = 'TiaraDB.Employeeinformation_log'\n"+
"TARGET.ATTRIBUTES.TargetTable = 'TiaraDB.Employeeinformation'\n"+
"TARGET.ATTRIBUTES.TdpId = 'localtd'\n"+
"DML.STATEMENT='INSERT INTO TiaraDB.Employeeinformation (empid,empname,empdesig,empsalary,empmail,empaddress,empzip,empcity,empstate,empcountry) VALUES (:empid,:empname,:empdesig,:empsalary,:empmail,:empaddress,:empzip,:empcity,:empstate,:empcountry);'\n"+
"SOURCE.SCHEMA='EmpId VARCHAR(5),EmpName VARCHAR(30),EmpDesig VARCHAR(10),EmpSalary VARCHAR(50),empmail VARCHAR(50),EmpAddress VARCHAR(50),EmpZip VARCHAR(50),EmpCity VARCHAR(50),EmpState VARCHAR(50),EmpCountry VARCHAR(50))'\n"+
"SOURCE.ATTRIBUTES.FileName = 'TDExportData.txt'\n"+
"SOURCE.ATTRIBUTES.Format = 'FORMATTED'\n"+
"SOURCE.ATTRIBUTES.OpenMode = 'Read'\n"+
"SOURCE.ATTRIBUTES.DirectoryPath = 'C:\\eclipse'\n"+
"SOURCE.ATTRIBUTES.IndicatorMode = 'N'\n"+
"SOURCE.ATTRIBUTES.TextDelimiter = ','";

 


SQL ASSISTANT


Hello,
Can we run Teradata utilities such as BTEQ and MLOAD through SQL Assistant?
-Suresh
 


Data mismatch while migrating data from SAS to Teradata using TPT


Hi,
I have been trying to load a 200 GB dataset from SAS into Teradata using the TPT MultiLoad option. The load was successful, but I noted a record count mismatch between the SAS dataset and the Teradata table. What could be the reasons for this? No records were captured in the UV and ET tables. Please also suggest an approach to identify the mismatched records.


Portable SQL Assistant?


Hello.
Does anybody know if there is a portable version of SQL Assistant, like a folder I can copy to my PC and just run the .exe?

thanks


“Warning: RDBMS CRASHED OR SESSIONS RESET. RECOVERY IN PROGRESS” running bteq on AIX


After installing Teradata version 14.0 fresh on an AIX v6.1 test box, I log in using the "bteq" command and get the warning "Warning: RDBMS CRASHED OR SESSIONS RESET. RECOVERY IN PROGRESS"; the session then just hangs until I break out, at which point I receive "*** Warning: Exiting because of three BREAKs!  *** Exiting BTEQ...  *** RC (return code) = 2". Note: the /etc/services file only lists "tdmst  1025/tcp", and "tpa" does not exist in /etc/init.d. All components installed successfully. Thank you.


End of Record marker with TPT


I have a CSV file, created in Excel, that I'm attempting to load via TPT. When I try to load the file with the expected number of delimiters, I get a BUFFERMAXSIZE error. When I add another delimiter to the end of each record, the file loads just fine. This isn't a huge issue, but I'm confused and would like to understand more about end-of-record markers. My schema definition is below, as well as my DataConnector properties.

DEFINE SCHEMA MVT_INPUT
	Description 'MOVEMENTS INPUT DEFINITION'
	(
		tk_num VARCHAR(5),
		tk_descr VARCHAR(30),
		rule VARCHAR(2),
		TagName1 VARCHAR(255),
		TagName2 VARCHAR(255)
	);

 

ATTRIBUTES
		(
			FileName = 'GravLoad.csv', 
			Format = 'DELIMITED', 
			OpenMode = 'Read', 
			DirectoryPath = 'D:\Tony\csv', 
			IndicatorMode = 'N', 
			TextDelimiter = ','
		)
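
One guess about the difference: Excel drops trailing delimiters when the last cells of a row are empty, so some rows may reach TPT with fewer fields than the five in the schema. If that is what is happening, the DataConnector's AcceptMissingColumns attribute tells the operator to pad short rows with NULLs instead of erroring. A sketch of the same attribute list with it added:

ATTRIBUTES
	(
		FileName = 'GravLoad.csv',
		Format = 'DELIMITED',
		OpenMode = 'Read',
		DirectoryPath = 'D:\Tony\csv',
		IndicatorMode = 'N',
		TextDelimiter = ',',
		AcceptMissingColumns = 'Y'
	)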

 


Issue while loading space delimited flat file using fastload on TPT


I am trying to load a flat file that is delimited with spaces and has 8 columns. While loading this data using TPT, I get the error "Delimited Data Parsing error: Column length overflow(s) in row 1". Even when I try to convert the space delimiter to a pipe, I get a number-of-columns mismatch on the flat file.
Any suggestions as to why a flat file with a SPACE delimiter is not working with TPT? Also, there are missing columns in this flat file, but we have used the ACCEPTMISSINGCOLUMNS option. Is this correct?
Sample data (which has multiple spaces between columns) is given below, along with the log and the script that was run.
Data:
====
d1      d2              d3              d4      d5      d6      d7      d8
1001    09/01/1995      09/03/1997      2       112     14233   1001    0    
1001    05/02/2008      12/02/2008      2       447     14189   9001    Odkp27uEjCEaByuZLQgXw6kbb88bmPwUfGAEnaH0mg0=  
1001    09/01/1995      09/03/1997      2       112     84      1001    kVbMu8z5RMJprze4ob1AX/IU6X3lDT6oIMbPgJyPbt0=  
1001    18/07/2003      18/07/2003      2       325     35      9001    0  
1001    05/02/2008      12/02/2008      2       447     14172   9001    0  
1001    05/02/2008      12/02/2008      2       447     8687    9001    0  
1001    05/02/2008      12/02/2008      2       447     14173   9001    0  
1001    03/06/2010      03/06/2010      2       551     15987   219001  MJTjTPGHOlOGg1LgSqUMXrB/4yfz4EGHeUfsrzGj780=  
1001    20/07/2010      20/07/2010      2       561     2632    0       27YEsg3G78SuGTcI4zcPCiIHK8wNsWcTrtU6glepyCA=  
1001    18/12/2001      19/12/2001      2       271     13607   9001    0   
1001    18/12/2001      19/12/2001      2       271     7578    9001    0   
1001    18/12/2001      19/12/2001      2       271     77      9001    0   
1001    16/05/2001      17/05/2001      2       253     63      9001    Lc8ybk+g5Iu9iF9eyCF0hL+9E8AlC3wTTOvRPzOQYf4=   
1001    16/10/2003      13/11/2003      2       349     13745   9001    1lKpwbahAH9QZW7hSqCaBqrIhSGoKF4TDYCWwNf/ZbE=   
1001    16/10/2003      28/10/2003      2       346     7926    9001    OXPTHvQ39elodq8CjRdpj5iqH40bhSLscBZjGicPIPU=   
1001    11/02/2008      11/02/2008      2       444     7326    0       2R/fiQqH9AlFd2tpGWii+peuoh9x2DywdGn+wG1QpVc=   
1001    03/11/1998      03/12/1998      2       173     1004    9001    0    
1001    26/05/2010      26/05/2010      2       548     7326    0       MJTjTPGHOlOGg1LgSqUMXr0rffpBEfqVFyScrFH5Gzc=  
1001    20/01/2010      21/01/2010      2       523     10289   219001  0   
1001    16/10/2003      20/10/2003      2       339     8491    9001    0   
1001    16/10/2003      23/10/2003      2       344     8491    9001    0NaMPHiuGhfoZy7cv1QNxqHxIsPjhQRUny2OZ69CMqA=
1001    16/05/2001      17/05/2001      2       252     66      9001    0   
1001    16/05/2001      17/05/2001      2       252     23      9001    0   
1001    18/12/2001      19/12/2001      2       271     13721   9001    0   
1001    16/05/2001      17/05/2001      2       252     24      9001    0   
1001    16/05/2001      17/05/2001      2       252     32      9001    0   
1001    20/01/2010      21/01/2010      2       522     64      219001  lfYay0a9dVu2wcF1OHpuBguP5n57iCep6GxIliM1DV0=   
1001    29/05/2009      29/05/2009      2       485     14563   219001  0   
 
 
 
TPT Log:
==========
Teradata Parallel Transporter Version 14.00.00.10
Job log: /opt/teradata/client/14.00/tbuild/logs/plk1-40.out
Job id is plk1-40, running on us111
Teradata Parallel Transporter DataConnector_C2: TPT19006 Version 14.00.00.10
DataConnector_C2 Instance 1 directing private log report to 'STG_DB.TPT_TEST'.
DataConnector_C2: TPT19008 DataConnector Producer operator Instances: 1
Teradata Parallel Transporter SQL Inserter Operator Version 14.00.00.10
Insert_TPT_TEST2: private log specified: STG_DB.TEST_Space
DataConnector_C2: TPT19003 ECI operator ID: DataConnector_C2-382
Insert_TPT_TEST2: connecting sessions
DataConnector_C2: TPT19222 Operator instance 1 processing file '/COPY/RAW_DATA/test_10000.txtab'.
Insert_TPT_TEST2: Total Rows Sent To RDBMS:      0
Insert_TPT_TEST2: Total Rows Applied:            0
Insert_TPT_TEST2: disconnecting sessions
Insert_TPT_TEST2: Total processor time used = '0.16 Second(s)'
Insert_TPT_TEST2: Start : Wed Mar 19 12:04:37 2014
Insert_TPT_TEST2: End   : Wed Mar 19 12:04:41 2014
Job step Load_TPT_TEST2 terminated (status 12)
Job plk1 terminated (status 12)
DataConnector_C2: TPT19350 I/O error on file '/COPY/RAW_DATA/test_10000.txtab'.
DataConnector_C2: TPT19003 Delimited Data Parsing error: Column length overflow(s) in row 1
DataConnector_C2: TPT19003 TPT Exit code set to 12.
DataConnector_C2: TPT19221 Total files processed: 0.

TPT Script:
===========

DEFINE JOB Load_TPT_TEST2
  DESCRIPTION 'Load a Teradata table from a space delimited flat file' (
      DEFINE SCHEMA Schema_TAB2 (
        D1 VARCHAR(20),
        D2 VARCHAR(10),
        D3 VARCHAR(10),
        D4 VARCHAR(3),
        D5 VARCHAR(20),
        D6 VARCHAR(20),
        D7 VARCHAR(20),
        D8 VARCHAR(50)        
);
      DEFINE OPERATOR DataConnector_C2
      TYPE DATACONNECTOR PRODUCER
      SCHEMA Schema_C2
      ATTRIBUTES (
          VARCHAR PrivateLogName     = 'STG_DB.TPT_TEST',
          VARCHAR FileName       = '/COPY/RAW_DATA/test_10000.txtab',
          VARCHAR TraceLevel       = 'All',
          VARCHAR FORMAT         = 'Delimited',
          VARCHAR TextDelimiter      = 'space',
          VARCHAR OpenMode        = 'read',
          VARCHAR AcceptMissingColumns = 'Y'
      );

      DEFINE OPERATOR Insert_TPT_TEST2
      TYPE INSERTER
      SCHEMA *
      ATTRIBUTES (
          VARCHAR PrivateLogName   = 'STG_DB.TEST_Space',
          VARCHAR TdpId           = 'xx.xxx.xxx.xx',
          VARCHAR UserName        = 'USER1',
          VARCHAR UserPassword    = 'USER1',
          VARCHAR TargetTable      = 'STG_DB.TPT_TEST',
          VARCHAR LogTable        = 'STG_DB.TPT_TEST_L',
          VARCHAR ErrorTable1       = 'STG_DB.TPT_TEST_E1',
          VARCHAR ErrorTable2       = 'STG_DB.TPT_TEST_E2',
          VARCHAR WorkTable        = 'STG_DB.TPT_TEST_WT'
      );

      STEP Load_TPT_TEST2 (
          APPLY (
              'INSERT INTO STG_DB.TEST_Space (
                D1,
                D2,
                D3,
                D4,
                D5,
                D6,
                D7,
                D8
                 
              )
              VALUES (
                     :D1,
                     :D2,
                     :D3,
                     :D4,
                     :D5,
                     :D6,
                     :D7,
                     :D8
                     );'
          )
          TO OPERATOR (
              Insert_TPT_TEST2[1]
          )
        SELECT
         D1,
                D2,
                D3,
                D4,
                D5,
                D6,
                D7,
                D8

          FROM OPERATOR (
              DataConnector_C2[1]
          );
      );

   );
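
A note on the likely mechanics: TextDelimiter is a literal string, so every single space in a run of spaces counts as its own delimiter, and each run therefore produces a string of empty columns; the file is really fixed-width or multi-space separated rather than singly delimited, which would explain both the overflow and the column-count mismatch after converting to pipe. (Separately, the DataConnector operator references SCHEMA Schema_C2 while the schema defined above is Schema_TAB2.) A sketch of a preprocessing step that strips trailing blanks and squeezes each space run into one pipe, after which TextDelimiter = '|' should see exactly eight columns per row:

# assumes no column value contains embedded spaces
sed 's/ *$//' /COPY/RAW_DATA/test_10000.txtab | tr -s ' ' '|' > /COPY/RAW_DATA/test_10000.pipe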
 


Error in MLOAD


Hello All,
This is the first time I am running a MultiLoad script. Please help me find the error.
Script:
.LOGTABLE DB.logs2;
.LOGON Jugal/jbhatt,jugal;
CREATE MULTISET TABLE DB.Mload_Input ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      Empid INTEGER,
      EmpName VARCHAR(5) CHARACTER SET LATIN CASESPECIFIC)
PRIMARY INDEX ( Empid );
.BEGIN IMPORT MLOAD TABLES DB.Mload_Input;
.LAYOUT S1;
.FIELD EmpId * VARCHAR(10);
.FIELD EmpName * VARCHAR(5);
.DML LABEL L1;
INSERT into DB.Mload_Input values(:EmpId,:EmpName);
.IMPORT INFILE /home/jbhatt/data.txt FORMAT VARTEXT ','
LAYOUT S1
APPLY L1;
.END MLOAD;
.LOGOFF;
Logs:
$ mload<MLOAD.txt
     ========================================================================
     =                                                                      =
     =          MultiLoad Utility    Release MLOD.14.00.00.08               =
     =          Platform LINUX                                              =
     =                                                                      =
     ========================================================================
     =                                                                      =
     =     Copyright 1990-2011 Teradata Corporation. ALL RIGHTS RESERVED.   =
     =                                                                      =
     ========================================================================
**** 09:10:51 UTY2411 Processing start date: FRI MAR 21, 2014
     ========================================================================
     =                                                                      =
     =          Logon/Connection                                            =
     =                                                                      =
     ========================================================================
0001 .LOGTABLE DB.logs2;
0002 .LOGON Jugal/jbhatt,;
**** 09:10:52 UTY8400 Teradata Database Release: 14.00.05.02
**** 09:10:52 UTY8400 Teradata Database Version: 14.00.05.03
**** 09:10:52 UTY8400 Default character set: ASCII
**** 09:10:52 UTY8400 Current RDBMS has interval support
**** 09:10:52 UTY8400 Current RDBMS has UDT support
**** 09:10:52 UTY8400 Current RDBMS has Large Decimal support
**** 09:10:52 UTY8400 Current RDBMS has TASM support
**** 09:10:52 UTY8400 Maximum supported buffer size: 1M
**** 09:10:52 UTY8400 Data Encryption supported by RDBMS server
**** 09:10:52 UTY6211 A successful connect was made to the RDBMS.
**** 09:10:52 UTY6210 Logtable 'DB.logs2' indicates that a restart is
     in progress.
     ========================================================================
     =                                                                      =
     =          Processing Control Statements                               =
     =                                                                      =
     ========================================================================
0003 .BEGIN IMPORT MLOAD TABLES DB.Mload_Input;
     ========================================================================
     =                                                                      =
     =          Processing MultiLoad Statements                             =
     =                                                                      =
     ========================================================================
0004 .LAYOUT S1;
0005 .FIELD EmpId * VARCHAR(10);
0006 .FIELD EmpName * VARCHAR(5);
0007 .DML LABEL L1;
0008 INSERT into DB.Mload_Input values(:EmpId,:EmpName);
0009 .IMPORT INFILE /home/jbhatt/data.txt FORMAT VARTEXT ','
     LAYOUT S1
     APPLY L1;
0010 .END MLOAD;
     ========================================================================
     =                                                                      =
     =          MultiLoad Initial Phase                                     =
     =                                                                      =
     ========================================================================
**** 09:10:52 UTY0829 Options in effect for this MultiLoad import task:
     .       Sessions:    One session per available amp.
     .       Checkpoint:  15 minute(s).
     .       Tenacity:    4 hour limit to successfully connect load sessions.
     .       Errlimit:    No limit in effect.
     .       AmpCheck:    In effect for apply phase transitions.
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
     Select NULL from DB.logs2 where (LogType = 125) and (Seq = 1)
     and (MloadSeq = 0);
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
     Select NULL from DB.logs2 where (LogType = 120) and (Seq = 1);
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
     SET QUERY_BAND='UTILITYNAME=MULTLOAD;' UPDATE FOR SESSION;
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
     CHECK WORKLOAD FOR BEGIN MLOAD DB.Mload_Input;
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
     CHECK WORKLOAD END;
**** 09:10:52 UTY0844 Session count 16 returned by the DBS overrides
     user-requested session count.
**** 09:10:56 UTY0815 MLOAD session(s) connected: 16.
**** 09:10:56 UTY0817 MultiLoad submitting the following request:
     BEGIN MLOAD DB.Mload_Input WITH INTERVAL;
**** 09:10:56 UTY0817 MultiLoad submitting the following request:
     Select NULL from DB.logs2 where (LogType = 130) and (Seq = 1)
     and (MloadSeq = 20);
**** 09:10:56 UTY0832 This MultiLoad import task cannot proceed: an unexpected
     MultiLoad phase, data acquisition, was reported by the RDBMS.

     ========================================================================
     =                                                                      =
     =          Logoff/Disconnect                                           =
     =                                                                      =
     ========================================================================
**** 09:10:58 UTY6212 A successful disconnect was made from the RDBMS.
**** 09:10:58 UTY2410 Total processor time used = '1.84 Seconds'
     .       Start : 09:10:51 - FRI MAR 21, 2014
     .       End   : 09:10:58 - FRI MAR 21, 2014
     .       Highest return code encountered = '12'.
$
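
For reference, the key lines are UTY6210 ("Logtable 'DB.logs2' indicates that a restart is in progress") and UTY0832: a previous run of this job failed part-way, so MultiLoad is attempting a restart whose recorded phase no longer matches. A sketch of a cleanup before a fresh run, submitted from BTEQ or SQL Assistant (the ET/UV/WT names are MultiLoad's defaults and are assumptions here):

RELEASE MLOAD DB.Mload_Input;    /* clear the load state on the target */
DROP TABLE DB.logs2;             /* the restart log table              */
DROP TABLE DB.ET_Mload_Input;    /* default acquisition error table    */
DROP TABLE DB.UV_Mload_Input;    /* default application error table    */
DROP TABLE DB.WT_Mload_Input;    /* default work table                 */

Note also that the script issues CREATE TABLE on every run; once DB.Mload_Input exists, that statement will itself fail on a rerun.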

 


Mload error while using Update/Insert for null PI fields


Hi All,
We are trying to use an update-else-insert loading strategy for a Teradata table. The table has a PI defined on fields that are nullable. The issue is that whenever we try to update a record having a null value in any of the PI fields, it gets rejected into the UV table (MultiLoad cannot identify the row to update and so attempts an insert).
To solve the issue, we created two DML labels, one for null records and the other for not-null records. However, when we use two APPLY statements (with IS NULL and IS NOT NULL), we get an error.
-----------------------------------------------------
.DML Label tagDML_Null
Do insert for missing update rows;
UPDATE :CF.DatabaseName.TAB1
SET
R_ID                                   = :R_ID   ,
WHERE
STAT IS NULL;
INSERT INTO :CF.DatabaseName.TAB1 (
R_ID                                  ,
STAT) VALUES(
:R_ID,
NULL);

.DML Label tagDML
Do insert for missing update rows;
UPDATE :CF.DatabaseName.TAB1
SET
R_ID                                   = :R_ID   ,
WHERE
STAT                                   = :STAT;
INSERT INTO :CF.DatabaseName.TAB1 (
R_ID                                  ,
STAT) VALUES(
:R_ID,
:STAT);

Import Infile ':CF.ImportFileName'
 Layout InputFileLayout
 Format Unformat
 Apply tagDML WHERE STAT IS NULL
 Apply tagDML_Null WHERE STAT IS NOT NULL
;

 

 

 

Can anyone please help me here with what can be done to fix the issue?

 

Thanks,

moloy
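
One thing stands out in the script as posted: the APPLY conditions look inverted relative to the label names; tagDML (which matches on STAT = :STAT) is applied WHERE STAT IS NULL, and tagDML_Null WHERE STAT IS NOT NULL. A sketch of the swapped APPLY clauses:

 Apply tagDML_Null Where STAT is Null
 Apply tagDML Where STAT is Not Null
;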


Teradata TPT Error :: TPT10508: RDBMS error 3812: The positional assignment list has too few values.


Hi,
When I ran the TPT script I got the error below:
LOAD_OPERATOR: connecting sessions
EXPORT_OPERATOR: connecting sessions
LOAD_OPERATOR: preparing target table
LOAD_OPERATOR: entering Acquisition Phase
LOAD_OPERATOR: TPT10508: RDBMS error 3812: The positional assignment list has too few values.
I am using a Teradata view as the source and a Teradata table as the target.
When I checked them, SHOW SEL on the source view lists 179 columns, while HELP TABLE on the target table returns 178 rows (one per column).
Thanks
Anil
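
For reference, error 3812 lines up with the counts quoted above: the INSERT generated from the 179-column view supplies one more value than the 178-column target table accepts. A sketch for locating the extra column via the dictionary (database and object names are placeholders):

SELECT ColumnName FROM DBC.ColumnsV
WHERE DatabaseName = 'srcdb' AND TableName = 'source_view'
MINUS
SELECT ColumnName FROM DBC.ColumnsV
WHERE DatabaseName = 'tgtdb' AND TableName = 'target_table';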
 
 
 
