Channel: Teradata Downloads - Tools

Can the wallet password be resupplied automatically by a script after a server reboot?


We need to resupply the wallet password after every server reboot in order to access the password-protected item values.

Can we automate this process by supplying the password from a script?
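
A minimal sketch of one way this is often automated, assuming the wallet command will accept the password on standard input; "resupply_wallet" and the password-file path are placeholders rather than documented Teradata Wallet syntax, so the actual command should be taken from the Teradata Wallet documentation:

#!/bin/sh
# Hypothetical boot-time script: read the wallet password from a root-only file
# and pipe it to whatever command re-opens the wallet after a reboot.
# "resupply_wallet" is a placeholder name, not a real tdwallet subcommand.
PW_FILE=/etc/td/.wallet_pw            # should be chmod 600 and owned by root
WALLET_PW=$(cat "$PW_FILE")
printf '%s\n' "$WALLET_PW" | /path/to/resupply_wallet

If the command insists on reading from a terminal, a tool such as expect can drive the interactive prompt instead; either way, the script would be hooked into the server's startup sequence (init or systemd).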


TTU for Mac OS


Hi, can you please tell me if TTU is available for Mac? My requirement is to run BTEQ from Terminal.app.
Thanks in advance for your help.


BTEQ IMPORT SKIP Command


Hello,
I need help using the BTEQ SKIP option while importing data from a file into a table. With the help of a few posts in this forum, I understand that the SKIP option should be used like below:

 

.Import Vartext',' file=C:\Users\Sridhar\Desktop\teradata\BTEQ Scripts Outputs\sridhar4.txt, SKIP=10;

 

When I tried the above, it throws an error on file open ("No such file or directory"), but when I tried it without the SKIP option it works.

 

Below is my script:

 

.logon 127.0.0.1/tduser,tduser

.Import Vartext',' file=C:\Users\Sridhar\Desktop\teradata\BTEQ Scripts Outputs\sridhar4.txt, SKIP=10;

 

 

.quiet on

 

database samples;

 

delete from emp_details_multiset;

 

.Repeat *

 

Using 

 

 in_no (varchar(10))

,in_name (varchar(50))

,in_sal (varchar(10))

,in_dept (varchar(10))

,in_loc (varchar(20)) 

 

Insert into emp_details_multiset (emp_id,emp_name,salary,dept,loc)

values(:in_no,:in_name,:in_sal,:in_dept,:in_loc);

 

select * from emp_details_multiset;

 

.quiet off

.quit

.logoff

 

Thanks,

Sridhar.
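
For reference, a hedged sketch of what the import line might look like once the delimiter is separated from the VARTEXT keyword and the file name is quoted (BTEQ needs quotes around a path that contains spaces, such as "BTEQ Scripts Outputs"); the path and SKIP value are the ones from the post:

.IMPORT VARTEXT ',' FILE='C:\Users\Sridhar\Desktop\teradata\BTEQ Scripts Outputs\sridhar4.txt', SKIP=10;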

 
 


DUPLICATE RECORDS USAGE


Hi all,
I am confused about the use of multiset tables in real-world work.
What I mean is: where do we need duplicate rows, and what is the use of a completely duplicate row in practice?
Why would we store complete duplicate records, and what are they used for?
In which environment is a complete duplicate record useful?
Please clarify this concept for me with a real-world scenario that uses complete duplicate records.
Thanks all.
RATHOD.
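
As one illustration of where exact duplicates are legitimate, here is a minimal sketch of a raw staging table for event data (table and column names are made up): identical rows can genuinely arrive from the source and must not be rejected or dropped, which is how a SET table would treat them.

CREATE MULTISET TABLE stg_web_clicks
( click_ts   TIMESTAMP(0)
, page_url   VARCHAR(500)
, user_agent VARCHAR(200)
)
PRIMARY INDEX (page_url);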


TPT API - Errors when building getbuffer sample


Hi Folks,
The Teradata TPT API could be very useful in our department for tailoring the operation of TPT. I tried building the getbuffer 64-bit sample program with Visual Studio 2010 and I'm getting 23 errors, all similar to the following. I know my way around C++, but I don't understand anything about dllimport/dllexport.
    1>GetBuffer.obj : error LNK2001: unresolved external symbol "__declspec(dllimport) public: void __thiscall ncr::teradata::client::API::DMLGroup::AddStatement(char *)" (__imp_?AddStatement@DMLGroup@API@client@teradata@ncr@@QAEXPAD@Z)
I also tried building the getbuffer 32-bit sample and I'm getting the following error. Searching the net, I tried setting Incremental Linking to "No", but the problem remains.
error LNK1123: failure during conversion to COFF: file invalid or corrupt  O:\TPT_API\getbuffer\LINK
Any help with solving either or both of these problems would be greatly appreciated.
Thanks!
-Greg
 


TTU 15.10 Windows installation paths


Why was the default installation path for SQL Assistant changed so that "Teradata SQL Assistant" no longer appears in the path? And why was the sample ODBC program also moved into the folder with SQL Assistant?


FASTLOAD Date format issue


Hello team,
Could you please help me with the issue below, concerning the date format in the DEFINE statement of FastLoad?

.LOGON 127.0.0.1/TDUSER,TDUSER

 

DROP TABLE SAMPLES.EMP_PERS;

 

 

CREATE TABLE SAMPLES.EMP_PERS

(EMP_ID INTEGER NOT NULL

,EMP_NAME VARCHAR(50)

,SALARY INTEGER

,DEPT_ID VARCHAR(20)

,LOC  VARCHAR(20)

,DOB DATE

,SSN INTEGER

)

 UNIQUE PRIMARY INDEX(EMP_ID) ;

 

BEGIN LOADING 

SAMPLES.EMP_PERS

ERRORFILEs SAMPLES.EMP_ERR1,SAMPLES.EMP_ERR2 ;

 

SET RECORD VARTEXT "~" ;

 

DEFINE 

EMP_ID (VARCHAR(9))

,EMP_NAME (VARCHAR(50))

,SALARY (VARCHAR(9))

,DEPT_ID (VARCHAR(20))

,LOC (VARCHAR(50))

,DOB (varchar, FORMAT 'MM/DD/YYYY')

,SSN (VARCHAR(20))

 

FILE= C:\Users\Sridhar\Desktop\teradata\BTEQ Scripts Outputs\FL1.txt;

 

INSERT INTO SAMPLES.EMP_PERS

(:EMP_ID

,:EMP_NAME

,:SALARY

,:DEPT_ID

,:LOC

,:DOB 

,:SSN

 

);

 

END LOADING;

LOGOFF;
 
Error message: definition syntax error for the DOB field.
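
A hedged guess at the cause: in VARTEXT mode every field in the FastLoad DEFINE must be a VARCHAR(n), so a FORMAT clause (and a VARCHAR with no length) is not accepted there. One common workaround, sketched below with only the two affected lines shown, is to define the field as a plain VARCHAR and put the date format on the table column instead (verify against the FastLoad reference for your release):

In the CREATE TABLE:   ,DOB DATE FORMAT 'MM/DD/YYYY'
In the DEFINE:         ,DOB (VARCHAR(10))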
 
 


TPT job was terminated even though "Highest return code encountered = '0'"


Hi, I work in TD Managed Services and I am not strong in TPT. One of the users reported that his TPT job was terminated, but he can't find anything in the logs that indicates the cause of the termination. Here are the TPT job log messages (original database and table names are masked).

Gather_Logs.000||start|Start|

teradata-load|start||

teradata-load|info|Starting component|

teradata-load|info|About to truncate table before load|

teradata-load|info|DELETE FROM xxxxx_xxx.x_xxxxxxx_xxxxxx ALL|

teradata-load|info|Executing program 'tbuild' with the following arguments:|

teradata-load|info|    tbuild -f /tmp/*****_*****/util-024720-443763-11916.cmds xxx_xx_xxxx-0511-0247am-tRWNdDuGV|

teradata-load|info|  and with the following Parallel Transporter commands:|

teradata-load|info|--------------------|

teradata-load|info|USING CHARACTER SET ASCII|

teradata-load|info|DEFINE JOB LOAD_USER_DATA|

teradata-load|info|  (|

teradata-load|info|    DEFINE OPERATOR LOADOP()|

teradata-load|info|    DESCRIPTION 'Parallel Transporter Inserter operator'|

teradata-load|info|    TYPE INSERTER|

teradata-load|info|    SCHEMA *|

teradata-load|info|    ALLOW PARALLEL MULTIPHASE|

teradata-load|info|    ATTRIBUTES|

teradata-load|info|      (|

teradata-load|info|        VARCHAR PrivateLogName = 'tptinsert',|

teradata-load|info|        VARCHAR TdpId = 'xxxxx',|

teradata-load|info|        VARCHAR UserName = 'xxx_xx_xxxx',|

teradata-load|info|        VARCHAR UserPassword = <*****>,|

teradata-load|info|        VARCHAR WorkingDatabase = 'xx_xxxx',|

teradata-load|info|        VARCHAR RemoveBOMFromFile = 'No',|

teradata-load|info|        VARCHAR DeleteLobDataFiles = 'Yes',|

teradata-load|info|        VARCHAR DateForm = 'ANSIDATE',|

teradata-load|info|        INTEGER MinSessions = 1,|

teradata-load|info|        INTEGER MaxSessions = 20|

teradata-load|info|      );|

teradata-load|info||

teradata-load|info|    DEFINE SCHEMA DATA_LAYOUT|

teradata-load|info|    DESCRIPTION 'datatypes of incoming data'|

teradata-load|info|      (|

teradata-load|info|        projectid DECIMAL(10),|

teradata-load|info|        userid CHAR(12),|

teradata-load|info|        type_td VARCHAR(250),|

teradata-load|info|        deleted CHAR(1),|

teradata-load|info|        instime CHAR(19),|

teradata-load|info|        updtime CHAR(19),|

teradata-load|info|        projectguid VARCHAR(40),|

teradata-load|info|        summaryxml_blob BLOB(2097088000) AS DEFERRED BY NAME,|

teradata-load|info|        summaryxml_clob CLOB(2097088000) AS DEFERRED BY NAME,|

teradata-load|info|        subtype VARCHAR(12),|

teradata-load|info|        detailxml BLOB(2097088000) AS DEFERRED BY NAME,|

teradata-load|info|        detailxml_clob CLOB(2097088000) AS DEFERRED BY NAME|

teradata-load|info||

teradata-load|info|      );|

teradata-load|info||

teradata-load|info|    DEFINE OPERATOR READ_DATA ()|

teradata-load|info|    DESCRIPTION 'Read incoming data records'|

teradata-load|info|    TYPE DATACONNECTOR PRODUCER|

teradata-load|info|    SCHEMA DATA_LAYOUT|

teradata-load|info|    ALLOW PARALLEL MULTIPHASE|

teradata-load|info|    ATTRIBUTES|

teradata-load|info|      (|

teradata-load|info|        VARCHAR PrivateLogName,|

teradata-load|info|        VARCHAR FileName,|

teradata-load|info|        VARCHAR AccessModuleName = 'xxxxxxxxxxxxxx_xxxxxx.xx',|

teradata-load|info|        VARCHAR AccessModuleInitStr,|

teradata-load|info|        INTEGER Timeout = 10,|

teradata-load|info|        VARCHAR IndicatorMode = 'Y',|

teradata-load|info|        VARCHAR OpenMode = 'Read',|

teradata-load|info|        VARCHAR Format = 'Binary'|

teradata-load|info|      );|

teradata-load|info||

teradata-load|info|   STEP load_data_from_pipes|

teradata-load|info|   (|

teradata-load|info|    APPLY|

teradata-load|info|      'INSERT INTO xxxxx_xxx.x_xxxxxxx_xxxxxx (PROJECTID,USERID,TYPE_TD,DELETED,INSTIME,UPDTIME,PROJECTGUID,SUMMARYXML_BLOB,SUMMARYXML_CLOB,SUBTYPE,DETAILXML,DETAILXML_CLOB) VALUES (:projectid,:userid,:type_td,:deleted,:instime,:updtime,:projectguid,:summaryxml_blob,:summaryxml_clob,:subtype,:detailxml,:detailxml_clob);'|

teradata-load|info|    TO OPERATOR ( LOADOP () [20])|

teradata-load|info|    SELECT * FROM OPERATOR|

teradata-load|info|      (|

teradata-load|info|        READ_DATA() [1]|

teradata-load|info|        ATTR|

teradata-load|info|          (|

teradata-load|info|            PrivateLogName = 'load_dclog0',|

teradata-load|info|            AccessModuleInitStr =  'fallback_directory=/xx/xxxx/xxx/xxxxxx/xxxxxxx/xxxx/. -pno=0',|

teradata-load|info|            FileName = 'Y/tmp/*****_*****/LD-xxxxxxxx-xxxxxxxx-xxxx-xx-x.'|

teradata-load|info|          )|

teradata-load|info|      );|

teradata-load|info|   );|

teradata-load|info||

teradata-load|info|  );|

teradata-load|info|--------------------|

teradata-load|stdout|Teradata Parallel Transporter Version 15.00.00.03 64-Bit|

teradata-load|stdout|Job log: /opt/teradata/client/15.00/tbuild/logs/xxx_xx_xxxx-0511-0247am-tRWNdDuGV-60.out|

teradata-load|stdout|Job id is xxx_xx_xxxx-0511-0247am-tRWNdDuGV-60, running on xxxxxxxx|

teradata-load|stdout|Teradata Parallel Transporter SQL Inserter Operator Version 15.00.00.03|

teradata-load|stdout|LOADOP: private log specified: tptinsert|

teradata-load|stdout|Teradata Parallel Transporter DataConnector Version 15.00.00.03|

teradata-load|stdout|READ_DATA[1]: Instance 1 directing private log report to 'load_dclog0-1'.|

teradata-load|stdout|READ_DATA[1]: DataConnector Producer operator Instances: 1|

teradata-load|stdout|Ab Initio AXSMOD Version 01.04.00.00 20:13:34 Nov  4 2014|

teradata-load|stdout|Ab Initio AXSMOD initializing at Mon May 11 02:48:06 2015|

teradata-load|stdout||

teradata-load|stdout|Ab Initio AXSMOD pmdc Version 'Common 14.00.00.07', packing 'pack (push, 1)'|

teradata-load|stdout|Ab Initio AXSMOD pmdd Version 'Common 14.00.00.01', packing 'pack (push, 1)'|

teradata-load|stdout|AB_AXSMOD log trace level  0|

teradata-load|stdout|AB_AXSMOD Running on behalf of graph vertex Output_Table__table_.perm-cont.000|

teradata-load|stdout|READ_DATA[1]: ECI operator ID: 'READ_DATA-30897'|

teradata-load|stdout|LOADOP: connecting sessions|

teradata-load|stdout|AB_AXSMOD read data timeout enabled|

teradata-load|stdout|AB_AXSMOD Infile: /tmp/*****_*****/LD-xxxxxxxx-xxxxxxxx-xxxx-xx-x|

teradata-load|stdout|READ_DATA[1]: Operator instance 1 processing file 'Y/tmp/*****_*****/LD-xxxxxxxx-xxxxxxxx-xxxx-xx-x.'.|

teradata-load|stdout|LOADOP: sending data|

teradata-load|stdout|AB_AXSMOD Got EOF Blip... Normal end to loading|

teradata-load|stdout|LOADOP: finished sending data|

teradata-load|stdout|Ab Initio AXSMOD shutdown at Mon May 11 03:26:38 2015|

teradata-load|stdout||

teradata-load|stdout|AB_AXSMOD is shutting down|

teradata-load|stdout|READ_DATA[1]: Total files processed: 1.|

teradata-load|stdout|LOADOP: Total Rows Sent To RDBMS:      142878|

teradata-load|stdout|LOADOP: Total Rows Applied:            142878|

teradata-load|stdout|LOADOP: disconnecting sessions|

teradata-load|stdout|LOADOP: Total processor time used = '102.96 Second(s)'|

teradata-load|stdout|LOADOP: Start : Mon May 11 02:48:06 2015|

teradata-load|stdout|LOADOP: End   : Mon May 11 03:26:38 2015|

teradata-load|stdout|Job step load_data_from_pipes completed successfully|

teradata-load|stdout|Job xxx_xx_xxxx-0511-0247am-tRWNdDuGV completed successfully|

teradata-load|stdout|Job start: Mon May 11 02:47:42 2015|

teradata-load|stdout|Job end:   Mon May 11 03:26:39 2015|

teradata-load|info||

teradata-load|info||

teradata-load|info|Getting Data Connector log info for stream 1: load_dclog0-1|

teradata-load|tlog|     =                  TraceFunction: 'NO (defaulted)' (=0)                  =|

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =                     TERADATA PARALLEL TRANSPORTER                      =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =              DATACONNECTOR OPERATOR VERSION  15.00.00.03               =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =           DataConnector UTILITY LIBRARY VERSION 15.00.00.35            =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =    COPYRIGHT 2001-2014, Teradata Corporation.  ALL RIGHTS RESERVED.    =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|      |

teradata-load|tlog|     Operator name: 'READ_DATA' instance 1 of 1 [Producer]|

teradata-load|tlog|      |

teradata-load|tlog|**** 02:48:06 Processing starting at: Mon May 11 02:48:06 2015|

teradata-load|tlog||

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =                    Operator module static specifics                    =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =                 Compiled for platform: '64-bit LINUX'                 =|

teradata-load|tlog|     =          Operator module name:'dtacop', version:'15.00.00.03'         =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     = pmdcomt_HeaderVersion: 'Common 15.00.00.04' - packing 'pack (push, 1)' =|

teradata-load|tlog|     = pmddamt_HeaderVersion: 'Common 15.00.00.01' - packing 'pack (push, 1)' =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|      |

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =                   > General attribute Definitions <                    =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =                             TraceLevel: ''                            =|

teradata-load|tlog|     =                   EndianFlip: 'NO (defaulted)' (=0)                    =|

teradata-load|tlog|     =                       IndicatorMode: 'YES' (=1)                        =|

teradata-load|tlog|     =                  NullColumns: 'YES (defaulted)' (=1)                   =|

teradata-load|tlog|     =                       SYSTEM_CharSetId: 'ASCII'                       =|

teradata-load|tlog|     =                          TimeOut: 10 seconds                           =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|      |

teradata-load|tlog|     LITTLE ENDIAN platform|

teradata-load|tlog|     WARNING!  NotifyMethod: 'None (default)'|

teradata-load|tlog|     Operator 'dtacop' main source version:'15.00.00.30'|

teradata-load|tlog|     DirInfo global variable name: 'DirInfo'|

teradata-load|tlog|     FileNames global variable name: 'FileNames'|

teradata-load|tlog|     DC_PREAD_SM_TOKENS global variable name: 'DC_PREAD_SM_TOKENS'|

teradata-load|tlog|      |

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =                   > Operator attribute Definitions <                   =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|      |

teradata-load|tlog|     DirectoryPath defaulting to: '/xx/xxxx/xxx/xxxxxx/xxxxxxx/xxxx/.WORK-serial/143112ac-55507aa3-28a3'|

teradata-load|tlog|     FileList: 'NO (defaulted)' (=0)|

teradata-load|tlog|     MultipleReaders: 'NO (defaulted)' (=0)|

teradata-load|tlog|     RecordsPerBuffer: (use default calculation per schema)|

teradata-load|tlog|     EnableScan: 'Yes (defaulted)'|

teradata-load|tlog|     Initializing with CharSet = 'ASCII'.|

teradata-load|tlog|     Alphabetic CSName=ASCII|

teradata-load|tlog|     Established character set ASCII|

teradata-load|tlog|     Single-byte character set in use|

teradata-load|tlog|     AccessModuleName: 'xxxxxxxxxxxxxx_xxxxxx.so'|

teradata-load|tlog|     WARNING!  Access module will not be called for checkpoint support.|

teradata-load|tlog|     AccessModuleInitStr: 'fallback_directory=/xx/xxxx/xxx/xxxxxx/xxxxxxx/xxxx/. -pno=0'|

teradata-load|tlog|      |

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =                       > Module Identification <                        =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|      |

teradata-load|tlog|     DataConnector operator for Linux release 2.6.32-279.el6.x86_64 on xxxxxxxx|

teradata-load|tlog|     TDICU................................... 15.00.00.00|

teradata-load|tlog|     PXICU................................... 15.00.00.03|

teradata-load|tlog|     Loading AM:xxxxxxxxxxxxxx_xxxxxx.so|

teradata-load|tlog|     PMPROCS................................. 15.00.00.12|

teradata-load|tlog|     PMRWFMT................................. 15.00.00.02|

teradata-load|tlog|     PMHADOOP................................ 15.00.00.03|

teradata-load|tlog|     PMTRCE.................................. 13.00.00.02|

teradata-load|tlog|     PMMM.................................... 15.00.00.01|

teradata-load|tlog|     DCUDDI.................................. 15.00.00.20|

teradata-load|tlog|     PMHEXDMP................................ 14.10.00.01|

teradata-load|tlog|     PMHDFSDSK............................... 15.00.00.01|

teradata-load|tlog|     ab_axsmod............................... 01.04.00.00|

teradata-load|tlog|      |

teradata-load|tlog|     Sending to access module: attribute 'TBR_OP_HANDLE'=x'c4b670'|

teradata-load|tlog|     Sending to access module: attribute 'CHARSET_NAME'='ASCII'|

teradata-load|tlog|     Sending to access module: attribute 'PRESERVE_RESTART_INFO'='YES'|

teradata-load|tlog|     Sending to access module: attribute 'TIMEOUT_SECONDS'='10'|

teradata-load|tlog|     >> Enter DC_DataConFileInfo|

teradata-load|tlog|     Job Type=0|

teradata-load|tlog|     UseGeneralUDDIcase: 'NO (defaulted)' (=0)|

teradata-load|tlog|     WriteBOM: 'NO (defaulted)' (=0)|

teradata-load|tlog|     AcceptExcessColumns: 'NO (defaulted)' (=0)|

teradata-load|tlog|     AcceptMissingColumns: 'NO (defaulted)' (=0)|

teradata-load|tlog|     TruncateColumnData: 'NO (defaulted)' (=0)|

teradata-load|tlog|     TruncateColumns: 'NO (defaulted)' (=0)|

teradata-load|tlog|     TruncateLongCols: 'NO (defaulted)' (=0)|

teradata-load|tlog|     WARNING!  RecordErrorFilePrefix attribute not specified, there is no default|

teradata-load|tlog|     RecordErrorVerbosity: OFF (default) (=0)|

teradata-load|tlog|     FileName: 'Y/tmp/*****_*****/LD-xxxxxxxx-xxxxxxxx-xxxx-xx-x.'|

teradata-load|tlog|     OpenMode: 'READ' (1)|

teradata-load|tlog|     Format: 'BINARY' (2)|

teradata-load|tlog|     IOBufferSize: 131072 (default)|

teradata-load|tlog|      |

teradata-load|tlog|     Full File Path: Y/tmp/*****_*****/LD-xxxxxxxx-xxxxxxxx-xxxx-xx-x.|

teradata-load|tlog|     Data Type              Ind  Length  Offset M|

teradata-load|tlog|              DECIMAL (  4)   1       8       0 N|

teradata-load|tlog|                 CHAR (  5)   1      12       8 N|

teradata-load|tlog|              VARCHAR (  7)   1     250      20 N|

teradata-load|tlog|                 CHAR (  5)   1       1     270 N|

teradata-load|tlog|                 CHAR (  5)   1      19     271 N|

teradata-load|tlog|                 CHAR (  5)   1      19     290 N|

teradata-load|tlog|              VARCHAR (  7)   1      40     309 N|

teradata-load|tlog|         BLOB BY NAME ( 41)   1    1024     349 N|

teradata-load|tlog|         CLOB BY NAME ( 42)   1    1024    1373 N|

teradata-load|tlog|              VARCHAR (  7)   1      12    2397 N|

teradata-load|tlog|         BLOB BY NAME ( 41)   1    1024    2409 N|

teradata-load|tlog|         CLOB BY NAME ( 42)   1    1024    3433 N|

teradata-load|tlog|     Schema is not all character data|

teradata-load|tlog|     Schema is not compatible with delimited data|

teradata-load|tlog|     Validating parsing case: 1200000.|

teradata-load|tlog|     SBCS (QUOTED DATA: No), Delimiter[0]: ''|

teradata-load|tlog|     Delimiter: x''|

teradata-load|tlog|     Escape Delimiter: x''|

teradata-load|tlog|     Open Quote: x''|

teradata-load|tlog|     Close Quote: x''|

teradata-load|tlog|     Escape Quote: x''|

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     =                    > Log will include stats only <                     =|

teradata-load|tlog|     =                                                                        =|

teradata-load|tlog|     ==========================================================================|

teradata-load|tlog|**** 02:48:07 From file 'Y/tmp/*****_*****/LD-xxxxxxxx-xxxxxxxx-xxxx-xx-x.', starting to send rows.|

teradata-load|tlog|     AllowBufferMode: 'YES (defaulted)' (=1)|

teradata-load|tlog|**** 03:25:17 Finished sending rows for Y/tmp/*****_*****/LD-xxxxxxxx-xxxxxxxx-xxxx-xx-x. (index 1)|

teradata-load|tlog|     Rows sent: 142878 (of 142878), (size: 0) CPU Time: 0.18 seconds|

teradata-load|tlog|     Sending stat to infrastructure: DCCheckPointNo=0|

teradata-load|tlog|     Sending stat to infrastructure: DCRowsRead=142878|

teradata-load|tlog|     Sending stat to infrastructure: DCFilesRead=1|

teradata-load|tlog|     Sending stat to infrastructure: DCRowErrorNo=0|

teradata-load|tlog|     Sending stat to infrastructure: DCFileName='Y/tmp/*****_*****/LD-xxxxxxxx-xxxxxxxx-xxxx-xx-x.'|

teradata-load|tlog|     Files read by this instance: 1|

teradata-load|tlog|**** 03:26:38 Total processor time used = '0.18 Seconds(s)'|

teradata-load|tlog|**** 03:26:38 Total files processed: 1|

teradata-load|info||

teradata-load|info||

teradata-load|info|Getting operator log info from private log tptinsert|

teradata-load|tlog| |

teradata-load|tlog|     ===================================================================|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =              TERADATA PARALLEL TRANSPORTER 64-BIT               =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =          SQL INSERTER OPERATOR     VERSION 15.00.00.03          =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =          OPERATOR SUPPORT LIBRARY VERSION 15.00.00.03           =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =           COPYRIGHT 2001-2014, TERADATA CORPORATION.            =|

teradata-load|tlog|     =                      ALL RIGHTS RESERVED.                       =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =                       Process I.D.: 22255                       =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     ===================================================================|

teradata-load|tlog||

teradata-load|tlog|**** 02:48:06 Processing starting at: Mon May 11 02:48:06 2015|

teradata-load|tlog| |

teradata-load|tlog|     ===================================================================|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =                      Module Identification                      =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     ===================================================================|

teradata-load|tlog||

teradata-load|tlog|     64-bit SQL Inserter Operator for Linux release 2.6.32-279.el6.x86_64 on xxxxxxxx|

teradata-load|tlog|     InsMain    : 15.00.00.10|

teradata-load|tlog|     InsCLI     : 14.10.00.05|

teradata-load|tlog|     InsUtil    : 15.00.00.05|

teradata-load|tlog|     InsSess    : 14.10.00.01|

teradata-load|tlog|     InsLob     : 15.00.00.05|

teradata-load|tlog|     PcomCLI    : 15.00.00.39|

teradata-load|tlog|     PcomMBCS   : 14.10.00.02|

teradata-load|tlog|     PcomMsgs   : 15.00.00.01|

teradata-load|tlog|     PcomNtfy   : 14.10.00.05|

teradata-load|tlog|     PcomPx     : 15.00.00.08|

teradata-load|tlog|     PcomUtil   : 15.00.00.10|

teradata-load|tlog|     PXICU      : 15.00.00.03|

teradata-load|tlog|     TDICU      : 15.00.00.00|

teradata-load|tlog|     CLIv2      : 15.00.00.00   |

teradata-load|tlog| |

teradata-load|tlog|     ===================================================================|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =                   Control Session Connection                    =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     ===================================================================|

teradata-load|tlog||

teradata-load|tlog|**** 02:48:06 Connecting to RDBMS:    'xxxxx'|

teradata-load|tlog|**** 02:48:06 Connecting with UserId: 'xxx_xx_xxxx'|

teradata-load|tlog| |

teradata-load|tlog|     ===================================================================|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =                  Teradata Database Information                  =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     ===================================================================|

teradata-load|tlog||

teradata-load|tlog|**** 02:48:06 Teradata Database Version:      '15.00.02.04'|

teradata-load|tlog|**** 02:48:06 Teradata Database Release:      '15.00.02.03'|

teradata-load|tlog|**** 02:48:06 Maximum request size supported: 1MB|

teradata-load|tlog|**** 02:48:06 Session character set:          'ASCII'|

teradata-load|tlog|**** 02:48:06 Data Encryption:                supported|

teradata-load|tlog|**** 02:48:06 Enhanced Statement Status Level: 1|

teradata-load|tlog|**** 02:48:06 Current working DATABASE set:   'xx_xxxx'|

teradata-load|tlog| |

teradata-load|tlog|     ===================================================================|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =                   Special Session Connection                    =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     ===================================================================|

teradata-load|tlog||

teradata-load|tlog|**** 02:48:07 Maximum number of special sessions requested: 20|

teradata-load|tlog|**** 02:48:07 Minimum number of special sessions required:  1|

teradata-load|tlog| |

teradata-load|tlog|              Instance Assigned Connected Result                |

teradata-load|tlog|              ======== ======== ========= ======================|

teradata-load|tlog|                  1         1        1    Successful|

teradata-load|tlog|                  2         1        1    Successful|

teradata-load|tlog|                  3         1        1    Successful|

teradata-load|tlog|                  4         1        1    Successful|

teradata-load|tlog|                  5         1        1    Successful|

teradata-load|tlog|                  6         1        1    Successful|

teradata-load|tlog|                  7         1        1    Successful|

teradata-load|tlog|                  8         1        1    Successful|

teradata-load|tlog|                  9         1        1    Successful|

teradata-load|tlog|                 10         1        1    Successful|

teradata-load|tlog|                 11         1        1    Successful|

teradata-load|tlog|                 12         1        1    Successful|

teradata-load|tlog|                 13         1        1    Successful|

teradata-load|tlog|                 14         1        1    Successful|

teradata-load|tlog|                 15         1        1    Successful|

teradata-load|tlog|                 16         1        1    Successful|

teradata-load|tlog|                 17         1        1    Successful|

teradata-load|tlog|                 18         1        1    Successful|

teradata-load|tlog|                 19         1        1    Successful|

teradata-load|tlog|                 20         1        1    Successful|

teradata-load|tlog|              ======== ======== ========= ======================|

teradata-load|tlog|                Total      20       20    Successful|

teradata-load|tlog| |

teradata-load|tlog|**** 02:48:07 Creating insert request for the job|

teradata-load|tlog|     USING projectid(DECIMAL(10)),userid(CHAR(12)),type_td(VARCHAR(250)),deleted(CHAR(1)),instime(CHAR(19)),updtime(CHAR(19)),projectguid(VARCHAR(40)),summaryxml_blob(BLOB(2097088000) AS DEFERRED),summaryxml_clob(CLOB(2097088000) AS DEFERRED),subtype(VARCHAR(12)),detailxml(BLOB(2097088000) AS DEFERRED),detailxml_clob(CLOB(2097088000) AS DEFERRED) INSERT INTO xxxxx_xxx.x_xxxxxxx_xxxxxx (PROJECTID,USERID,TYPE_TD,DELETED,INSTIME,UPDTIME,PROJECTGUID,SUMMARYXML_BLOB,SUMMARYXML_CLOB,SUBTYPE,DETAILXML,DETAILXML_CLOB) VALUES (:projectid,:userid,:type_td,:deleted,:instime,:updtime,:projectguid,:summaryxml_blob,:summaryxml_clob,:subtype,:detailxml,:detailxml_clob);|

teradata-load|tlog| |

teradata-load|tlog|     ===================================================================|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =                     Column/Field Definition                     =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     ===================================================================|

teradata-load|tlog||

teradata-load|tlog|     Column Name                    Offset Length Type      |

teradata-load|tlog|     ============================== ====== ====== ========================|

teradata-load|tlog|     projectid                           0      8 DECIMAL(10,0)|

teradata-load|tlog|     userid                              8     12 CHAR|

teradata-load|tlog|     type_td                            20    250 VARCHAR|

teradata-load|tlog|     deleted                           272      1 CHAR|

teradata-load|tlog|     instime                           273     19 CHAR|

teradata-load|tlog|     updtime                           292     19 CHAR|

teradata-load|tlog|     projectguid                       311     40 VARCHAR|

teradata-load|tlog|     summaryxml_blob                   353   1024 DEFERRED BLOB BY NAME|

teradata-load|tlog|     summaryxml_clob                  1379   1024 DEFERRED CLOB BY NAME|

teradata-load|tlog|     subtype                          2405     12 VARCHAR|

teradata-load|tlog|     detailxml                        2419   1024 DEFERRED BLOB BY NAME|

teradata-load|tlog|     detailxml_clob                   3445   1024 DEFERRED CLOB BY NAME|

teradata-load|tlog|     ============================== ====== ====== ========================|

teradata-load|tlog|     INDICATOR BYTES NEEDED: 2|

teradata-load|tlog|     EXPECTED RECORD LENGTH: 4473|

teradata-load|tlog|**** 02:48:08 Checkpoint complete. Rows inserted: 0|

teradata-load|tlog| |

teradata-load|tlog|     ===================================================================|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =                           Load Phase                            =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     ===================================================================|

teradata-load|tlog||

teradata-load|tlog|**** 02:48:19 Starting to send data to the RDBMS|

teradata-load|tlog|**** 03:26:36 Checkpoint complete. Rows inserted: 142878|

teradata-load|tlog|**** 03:26:37 Checkpoint complete. Rows inserted: 142878|

teradata-load|tlog|**** 03:26:38 Finished sending rows to the RDBMS|

teradata-load|tlog| |

teradata-load|tlog|              Instance    Rows Sent             Rows Inserted|

teradata-load|tlog|              ========  ====================  ====================|

teradata-load|tlog|                  1                    45741                 45741|

teradata-load|tlog|                  2                    41561                 41561|

teradata-load|tlog|                  3                    28385                 28385|

teradata-load|tlog|                  4                    16433                 16433|

teradata-load|tlog|                  5                     7282                  7282|

teradata-load|tlog|                  6                     1639                  1639|

teradata-load|tlog|                  7                     1032                  1032|

teradata-load|tlog|                  8                      608                   608|

teradata-load|tlog|                  9                      197                   197|

teradata-load|tlog|                 10                        0                     0|

teradata-load|tlog|                 11                        0                     0|

teradata-load|tlog|                 12                        0                     0|

teradata-load|tlog|                 13                        0                     0|

teradata-load|tlog|                 14                        0                     0|

teradata-load|tlog|                 15                        0                     0|

teradata-load|tlog|                 16                        0                     0|

teradata-load|tlog|                 17                        0                     0|

teradata-load|tlog|                 18                        0                     0|

teradata-load|tlog|                 19                        0                     0|

teradata-load|tlog|                 20                        0                     0|

teradata-load|tlog|              ========  ====================  ====================|

teradata-load|tlog|                Total                 142878                142878|

teradata-load|tlog| |

teradata-load|tlog|**** 03:26:38 Number of bytes sent to the RDBMS for this job: 35839547419|

teradata-load|tlog| |

teradata-load|tlog|**** 03:26:38 Load Phase statistics:|

teradata-load|tlog|              Elapsed time: 00:00:38:19 (dd:hh:mm:ss)|

teradata-load|tlog|              CPU time:     101.84 Second(s)|

teradata-load|tlog|              MB/sec:       14.867|

teradata-load|tlog|              MB/cpusec:    335.617|

teradata-load|tlog| |

teradata-load|tlog|     ===================================================================|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     =                        Logoff/Disconnect                        =|

teradata-load|tlog|     =                                                                 =|

teradata-load|tlog|     ===================================================================|

teradata-load|tlog||

teradata-load|tlog|**** 03:26:38 Logging off all sessions|

teradata-load|tlog| |

teradata-load|tlog|              Instance      Cpu Time     |

teradata-load|tlog|              ========  ================ |

teradata-load|tlog|                   1       43.22 Seconds|

teradata-load|tlog|                   2       25.09 Seconds|

teradata-load|tlog|                   3       17.48 Seconds|

teradata-load|tlog|                   4       10.42 Seconds|

teradata-load|tlog|                   5        4.55 Seconds|

teradata-load|tlog|                   6        0.84 Seconds|

teradata-load|tlog|                   7        0.39 Seconds|

teradata-load|tlog|                   8        0.19 Seconds|

teradata-load|tlog|                   9        0.11 Seconds|

teradata-load|tlog|                   10        0.06 Seconds|

teradata-load|tlog|                   11        0.06 Seconds|

teradata-load|tlog|                   12        0.06 Seconds|

teradata-load|tlog|                   13        0.06 Seconds|

teradata-load|tlog|                   14        0.06 Seconds|

teradata-load|tlog|                   15        0.06 Seconds|

teradata-load|tlog|                   16        0.06 Seconds|

teradata-load|tlog|                   17        0.07 Seconds|

teradata-load|tlog|                   18        0.06 Seconds|

teradata-load|tlog|                   19        0.06 Seconds|

teradata-load|tlog|                   20        0.06 Seconds|

teradata-load|tlog| |

teradata-load|tlog|**** 03:26:38 Total processor time used = '102.96 Second(s)'|

teradata-load|tlog|     .        Start : Mon May 11 02:48:06 2015|

teradata-load|tlog|     .        End   : Mon May 11 03:26:38 2015|

teradata-load|tlog|     .        Highest return code encountered = '0'.|

teradata-load|tlog|**** 03:26:38 This job terminated|

teradata-load|end||

Gather_Logs.000||finish|End|

 

Please help. Thank you.


TPT Permission error


 

I have a very weird TPT permission issue. When I trigger a TPT script from a Linux CentOS terminal, the script succeeds. But if I trigger it from our enterprise scheduler (Opswise), the TPT script fails with the error below. In both cases the script is triggered by the same user. I have also ensured that the environment variables between the two trigger modes match. Can someone give me pointers on how to debug this further?


tpt_reader Instance 8 directing private log report to 'dtacop-optimus_app-29355-8'.
tpt_reader Instance 1 directing private log report to 'dtacop-optimus_app-29345-1'.
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader Instance 2 directing private log report to 'dtacop-optimus_app-29346-2'.
tpt_reader Instance 7 directing private log report to 'dtacop-optimus_app-29351-7'.
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader: TPT19008 DataConnector Producer operator Instances: 10
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_8.txt' (Permission denied)'
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_2.txt' (Permission denied)'
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader Instance 4 directing private log report to 'dtacop-optimus_app-29348-4'.
tpt_reader Instance 10 directing private log report to 'dtacop-optimus_app-29358-10'.
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_7.txt' (Permission denied)'
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_4.txt' (Permission denied)'
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_10.txt' (Permission denied)'
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader Instance 3 directing private log report to 'dtacop-optimus_app-29347-3'.
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader Instance 5 directing private log report to 'dtacop-optimus_app-29349-5'.
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_3.txt' (Permission denied)'
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_5.txt' (Permission denied)'
tpt_reader Instance 9 directing private log report to 'dtacop-optimus_app-29356-9'.
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader Instance 6 directing private log report to 'dtacop-optimus_app-29350-6'.
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader: TPT19003 NotifyMethod: 'None (default)'
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_9.txt' (Permission denied)'
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader: TPT19003 ECI operator ID: 'tpt_reader-29345'
tpt_reader: TPT19434 pmOpen failed. General failure (34): 'pmUnxDskOpen: File: 'TPT_DCOP_RE_TMP_optimus_app_6.txt' (Permission denied)'
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader: TPT19222 Operator instance 1 processing file '/var/groupon/tmp/tmpVnEUDK/sandbox.aff_gpn_lup1_temp_trans_chlr10'.
tpt_writer: connecting sessions
tpt_writer: preparing target table
tpt_writer: entering Acquisition Phase
tpt_writer: disconnecting sessions
tpt_reader: TPT19221 Total files processed: 0.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19003 TPT Exit code set to 12.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
tpt_reader: TPT19404 pmOpen failed. Requested file not found (4)
tpt_reader: TPT19304 Fatal error opening file.
invoke tpt: /opt/teradata/client/Current/tbuild/bin/tbuild -f /var/groupon/tmp/sandbox.aff_gpn_lup1_temp_trans_chlr1_05_13_UXbS16/import.scr -h 128M -j sandbox.aff_gpn_lup1_temp_trans_chlr1 -R 0 -z 0
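
Since the same user succeeds from a terminal but fails under Opswise, the usual suspects are the working directory, umask, and environment that the scheduler hands to the job. A minimal diagnostic sketch (plain POSIX shell; the output path is a placeholder) that could be run just before tbuild in both launch contexts, so the two captures can be diffed:

#!/bin/sh
# Dump the runtime context so the terminal and Opswise launches can be compared.
# /tmp/tpt_ctx.$$ is only a placeholder output path.
{
  date
  id              # effective user and groups
  umask           # file-creation mask; a restrictive value can produce "Permission denied" on new files
  pwd             # directory from which tbuild is invoked; relative file names resolve against it
  env | sort
} > /tmp/tpt_ctx.$$ 2>&1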



 

 

 

 


Mload Error - UTY3403 Only one statement per line is allowed.


I have an Mload script which gives the following error. 

UTY3403 Only one statement per line is allowed.  Extra characters were detected beginning in column '43'.

 

I have checked the script against my other scripts and it's fine. There are no syntax problems.

Does this error have something to do with the format of the import data file?

I have an .xlsx file containing the data to be inserted into the table. There are 14 columns in the file and 15 columns in the table, so I am casting the 15th column as a date in my MLOAD file layout. I don't think this is the issue.

 

I saved the data file as a .csv with a pipe delimiter: MLOAD fails.
I changed the delimiter to tab, pipe, and semicolon: all lead to the same error.
What else could be the trouble here?

 

I heard from an acquaintance that changing the layout name would eliminate the issue.
Is the name of the layout really the problem?
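
UTY3403 is raised while MultiLoad parses the script itself (extra characters were found after the end of a statement on one line), so it usually points at the script rather than at the data file. For comparison, a hedged, generic sketch of a pipe-delimited VARTEXT layout and import with one statement per line, each closed by its own semicolon (all names are placeholders):

.LAYOUT FILE_LAYOUT;
.FIELD IN_COL01 * VARCHAR(20);
.FIELD IN_COL02 * VARCHAR(50);
.FIELD IN_COL03 * VARCHAR(10);

.IMPORT INFILE datafile.csv
        FORMAT VARTEXT '|'
        LAYOUT FILE_LAYOUT
        APPLY INSERT_LABEL;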


TPT19003 Number of characters in column 20 (65537) exceeds maximum allowed (65536)


After a TTU upgrade from 13 to 14 on Unix, the same TPT script is failing with the error below; it ran fine with the earlier version.
The script exports data from a table to a pipe-delimited file.
Note: the script runs fine if the table has few records (say, fewer than 500), but with 1,000 records it fails.
 Error:
FILE_WRITER: TPT19003 Number of characters in column 20 (65537) exceeds maximum allowed (65536)
 
Tbuild command:
tbuild -f tptscript -v exportjobvar.txt jobname
Below is the tpt script:

DEFINE JOB EXPORT_FROM_TABLE_TO_FILE
DESCRIPTION 'EXPORT DATA FROM A TABLE TO FILE'
(
   DEFINE SCHEMA TABLE_SCHEMA
   DESCRIPTION 'TABLE SCHEMA'   (
INCLUDE @inschema;
   );

   DEFINE OPERATOR FILE_WRITER()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'   TYPE DATACONNECTOR CONSUMER
   SCHEMA TABLE_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName    = @Filewriterlog,
      VARCHAR FileName          = @FileName,
      VARCHAR OpenMode          = 'Write',
      VARCHAR Format            = 'Delimited',
      VARCHAR TextDelimiter     = '|'   );

   DEFINE OPERATOR EXPORT_OPERATOR()
   DESCRIPTION 'TERADATA PARALLEL TRANSPORTER EXPORT OPERATOR'   TYPE EXPORT
   SCHEMA TABLE_SCHEMA
   ATTRIBUTES
   (
      VARCHAR PrivateLogName    = @exportop,
      INTEGER MaxSessions       =  32,
      INTEGER MinSessions       =  1,
      VARCHAR UserName          = 'user',
      VARCHAR UserPassword      = 'pwd',
      VARCHAR AccountId,
                        INCLUDE @selschema;
   );

   STEP EXPORT_2_FILE
   (
      APPLY TO OPERATOR (FILE_WRITER() [1] )
      SELECT * FROM OPERATOR (EXPORT_OPERATOR() [1] );
   );

);

 


Join on Non indexed columns


Hello,
I have an interview question:
What happens internally in Teradata when we join two tables to find matching rows on the EMP_Name column, where EMP_ID is the indexed column but EMP_Name is not?
Does anything happen on the AMPs' side, such as rearranging the data, and what does "rearranging" mean in Teradata?
Thanks,
Sridhar.
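
A minimal sketch to see the behaviour for yourself (the table names are made up): running EXPLAIN on a join over a column that is not the primary index will typically show the rows being redistributed by the hash of the join column, or the smaller table being duplicated to all AMPs, before the AMP-local join runs; that row movement is the "rearranging".

EXPLAIN
SELECT a.EMP_ID, b.EMP_ID
FROM emp_a a
JOIN emp_b b
  ON a.EMP_Name = b.EMP_Name;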


bteq


Hi, is anyone available for online training on Teradata performance tuning?


Perm Space Value in BTEQ Export Report file incorrect


Hi all,
I am trying to write a simple script to report tables which have not been used in the last 90 days. I am using the code below to get the values (some specific tables have been renamed).

.SET WIDTH 254;

.EXPORT REPORT FILE= unused_tables_VD.txt

SELECT CAST(FINAL1.DATABASENAME AS CHAR(40)) (TITLE 'DATABASENAME'),CAST(FINAL1.TABLENAME AS CHAR(50))(TITLE 'TABLENAME'),FINAL1.LAST_ACCESS (TITLE 'LAST_ACCESS'),FINAL1.SPACE_IN__GB (TITLE 'SPACE_IN_GB'),T.CREATORNAME(TITLE 'CREATOR')

FROM

 

(

SELECT OAA.DATABASENAME,OAA.TABLENAME,OAA.FIRST_ACCESS, OAA.LAST_ACCESS,

SUM(TS.CURRENTPERM/1024/1024/1024) AS SPACE_IN__GB

FROM

 
But when the report is generated, the output looks like the sample below. Everything comes out as expected except SPACE_IN_GB (the space occupied by the table), which shows an unusual value. I cannot tell whether this is a BTEQ issue or whether I have to use some function to get the correct value.

DATABASENAME                              TABLENAME                                           LAST_ACCESS                           SPACE_IN_GB                            CREATOR

----------------------------------------  --------------------------------------------------  -----------  ----------------------  ------------------------------

XXX_XXX_XXX                               123abc                                                   2015/01/18                      2.72951547622681E 002                      user1

XXX_1234_XXX                             abcdefg                                                   2015/01/18                      1.59725086212158E 002                      user2

Please look at SPACE_IN_GB: the first value is actually 273 GB. Is there something I have missed?
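
The value 2.72951547622681E 002 is simply 272.95... shown in BTEQ's default exponent format for FLOAT results (CurrentPerm is a FLOAT, so the SUM and the divisions stay FLOAT). A hedged sketch of one way to get a fixed-point display, by casting the expression to DECIMAL and attaching a FORMAT:

CAST(SUM(TS.CURRENTPERM)/1024/1024/1024 AS DECIMAL(18,2)) (FORMAT 'ZZZ,ZZZ,ZZ9.99') AS SPACE_IN__GB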


TDGeoimport on Linux


Hi everyone! I've been using the TDGeoImport tool on Windows, but I have to migrate all the ETL processes to Linux, and now I have to install TDGeoImport on Linux. Is there a version for Linux, or is there a single version that works on both Windows and Linux?

 

If there is only a single version, does anyone know how to install it?

 

Thanks for the help!


Question reg. TTU Support for RHEL on PowerPC platform


Hi,
According to the product support matrix, TTU is currently not supported for RHEL on the PowerPC platform.
Is there any roadmap planned to support RHEL on the PowerPC architecture?
Thanks.


data provider for Ole DB


I know that functional updates are no longer being added to the data provider for OLE DB, but I was curious whether it is being updated for 15.10 and whether it will be installed in the same path as the other executables/libraries.


Teradata Studio and Teradata Analyst Pack


Hello everybody,

I would like to investigate the possibilities of Teradata Visual Explain, and I'm using Teradata Studio Express (v. 15.00) to connect to our Teradata installation. To use the tool, I need the Teradata Analyst Pack. Is the tool available to partners, and if so, how can we get access to it?

Many thanks!


Need explanation on an unexpected error raised by the SOAP application.


Hi,

I encountered an "unexpected" error when using your SOAP application.

Error below:

> An unexpected error occurred (Error ID: api01/1432714249395). Please retry your request.

How can I find out what happened, in order to fix it?


TPT Inserter Operator


Hi,
Does the TPT SQL Inserter operator support an error limit attribute? Looking at the manual, it seems it doesn't, but the Stream operator does seem to support it. Currently we are looking for alternatives to the Load operator for low-volume imports, but our alternative solution must support error-limit and skip-row features.
 
Thanks,
Charles
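
For reference, a hedged sketch of what a Stream operator definition carrying an error limit might look like; the attribute names should be verified against the TPT Reference for your release, and the job-variable names are placeholders:

DEFINE OPERATOR STREAM_LOAD()
DESCRIPTION 'Stream operator as a low-volume alternative to the Load operator'
TYPE STREAM
SCHEMA *
ATTRIBUTES
(
   VARCHAR TdpId        = @TdpId,
   VARCHAR UserName     = @UserName,
   VARCHAR UserPassword = @UserPassword,
   VARCHAR LogTable     = @LogTable,
   VARCHAR ErrorTable   = @ErrorTable,
   INTEGER ErrorLimit   = 1
);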
