TPT Stream Operator
Hi,
Can anyone explain whether the TPT Stream operator (the TPT equivalent of TPump) can be used to load data from a file to a table, and from a table to a file? If it is possible, could you kindly provide a script for the same?
In my view, the TPT Stream operator is best suited for mini-batch loads. Could you confirm this as well, please?
Thank you,
Ramkumar
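For what it's worth, here is a minimal sketch of a file-to-table Stream job, not a production script: every database, table, file, and credential value below is a placeholder. Note that the Stream operator is a consumer, so it only writes into tables; for the table-to-file direction you would pair an Export (or SQL Selector) producer with a DataConnector consumer instead.

```
DEFINE JOB stream_load_sketch
DESCRIPTION 'Sketch: load a delimited file into a table via the Stream operator'
(
  DEFINE SCHEMA input_schema
  (
    col1 VARCHAR(50),
    col2 VARCHAR(50)
  );

  DEFINE OPERATOR file_reader
  TYPE DATACONNECTOR PRODUCER
  SCHEMA input_schema
  ATTRIBUTES
  (
    VARCHAR FileName      = 'input.txt',   /* placeholder */
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = '|',
    VARCHAR OpenMode      = 'Read'
  );

  DEFINE OPERATOR stream_load
  TYPE STREAM
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR TdpId        = 'mysystem',     /* placeholder */
    VARCHAR UserName     = 'myuser',
    VARCHAR UserPassword = 'mypassword',
    VARCHAR LogTable     = 'MyDB.MyTable_log'
  );

  APPLY
    ('INSERT INTO MyDB.MyTable (col1, col2) VALUES (:col1, :col2);')
  TO OPERATOR (stream_load)
  SELECT * FROM OPERATOR (file_reader);
);
```

Run with something like `tbuild -f stream_load_sketch.tpt`. On the mini-batch point: Stream (like TPump) applies rows with ordinary SQL INSERTs rather than the block-level FastLoad protocol, which is why it is generally positioned for mini-batch and continuous loads rather than large bulk loads.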
Convert a Field
Hello,
I have a CHAR(12) field in a table. It is a last-12-months field: each of the 12 characters represents one of the past 12 months. The problem is that any of those characters can be blank, so when I export, the output gets misaligned. Is there a way to look into the field and convert any blanks into a value, say 'x'?
Field will look like this:
'1 12 1 1'
' 11 '
I want to see this:
'1xxx12xxx1x1'
'xxxxxxxxx11x'
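Not a definitive answer, but assuming Teradata 14.0 or later, OTRANSLATE does a character-for-character substitution and can turn every blank into 'x' on the way out (the column and table names here are placeholders):

```sql
-- Replace each space in the 12-character field with 'x' before exporting.
SELECT OTRANSLATE(last12_mth_fld, ' ', 'x') AS last12_mth_out
FROM MyDB.MyTable;
```

On older releases without OTRANSLATE, the same effect takes a nested expression per position (e.g. CASE over SUBSTR), so it is worth checking whether the function is available first.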
MultiLoad failing when trying to load a file with þ (thorn) as the delimiter
Hi All,
Not sure what I'm doing wrong, but I'm trying to load a thorn-delimited (þ) file. Here is a sample record:
"field1"þ"field2"þ"field3"þ"field4"þ"field5"þ"field6"þ"field7"þ"field8"þ"field9"þ"field10"þ"field11"þ"field12"þ"field13"þ"field14"þ"field15"
The MultiLoad script follows:
.logtable test.testtbl_log;
.logon 192.168.201.133/dbc,xxxxxxxxxx;

drop table test.testtbl_et;
drop table test.testtbl_ut;
drop table test.testtbl_uv;
drop table test.testtbl_wt;

.begin import mload tables test.testtbl SESSIONS 20;

.layout InputFile_layout;
.field DateStr        * VARCHAR(255);
.field Event          * VARCHAR(255);
.field ToolbarId      * VARCHAR(255);
.field UserId         * VARCHAR(255);
.field Username       * VARCHAR(255);
.field UserEmail      * VARCHAR(255);
.field FirstName      * VARCHAR(255);
.field LastName       * VARCHAR(255);
.field Browser        * VARCHAR(255);
.field BrowserVersion * VARCHAR(255);
.field OS             * VARCHAR(255);
.field ToolbarVersion * VARCHAR(255);
.field SearchSource   * VARCHAR(4000);
.field ClickType      * VARCHAR(4000);
.field ClickUrl       * VARCHAR(4000);

.dml label TableName_InsertDML;
insert into test.testtbl (
  DateStr, Event, ToolbarId, UserId, Username, UserEmail,
  FirstName, LastName, Browser, BrowserVersion, OS,
  ToolbarVersion, SearchSource, ClickType, ClickUrl
) values (
  trim(both from trim(both '"' from :DateStr)),
  trim(both from trim(both '"' from :Event)),
  trim(both from trim(both '"' from :ToolbarId)),
  trim(both from trim(both '"' from :UserId)),
  trim(both from trim(both '"' from :Username)),
  trim(both from trim(both '"' from :UserEmail)),
  trim(both from trim(both '"' from :FirstName)),
  trim(both from trim(both '"' from :LastName)),
  trim(both from trim(both '"' from :Browser)),
  trim(both from trim(both '"' from :BrowserVersion)),
  trim(both from trim(both '"' from :OS)),
  trim(both from trim(both '"' from :ToolbarVersion)),
  trim(both from trim(both '"' from :SearchSource)),
  trim(both from trim(both '"' from :ClickType)),
  trim(both from trim(both '"' from :ClickUrl))
);

.import infile testtbl_all.csv
  format vartext '00FE'xc display errors
  layout InputFile_Layout
  apply TableName_InsertDML;

.end mload;
.logoff;
I get the following error:
"field1"├╛"field2"├╛"field3"├╛"field4"├╛"field5"├╛"field6"├╛"field7"├╛"field8"├╛"field9"├╛"field10"├╛"field11"├╛"field12"├╛"field13"├╛"field14"├╛"field15"
**** 14:03:52 UTY4014 Access module error '61' received during 'pmReadDDparse' operation: 'Warning, too few columns !ERROR! Delimited Data Parsing error: Too few columns in row 1'
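A guess at the cause rather than a confirmed answer: the ├╛ in the console output is the two-byte UTF-8 encoding of þ (0xC3 0xBE) rendered in an OEM code page, which suggests the file is UTF-8 while '00FE'xc describes a single Latin-1 byte, so the delimiter never matches and the whole record parses as one column. One workaround sketch is to convert the file to Latin-1 before loading (file names below are illustrative, not from the original post):

```shell
# Simulate the situation: þ written as UTF-8 is the byte pair C3 BE,
# which the single-byte '00FE'xc delimiter in the MultiLoad script
# cannot match.
printf '"field1"\xc3\xbe"field2"\n' > utf8_sample.csv

# Converting to Latin-1 turns each þ back into the single byte 0xFE,
# matching the script's delimiter.
iconv -f UTF-8 -t ISO-8859-1 utf8_sample.csv > latin1_sample.csv

# Show the raw bytes; the delimiter now appears as "fe".
od -An -tx1 latin1_sample.csv
```

Running MultiLoad with a UTF-8 session character set may also be an option, but the file conversion above is the part that can be verified byte by byte.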
TPT Loader and JSON files
A couple of questions.
First, when launching the Teradata Parallel Transporter Wizard, I receive the following error message:
Exception in thread "main" java.lang.UnsatisfiedLinkError: C:\Program Files (x86)\Teradata\15.00\Teradata Parallel Transporter\bin\wwhelp.dll: Can't load IA 32-bit .dll on a AMD 64-bit platform
The rest of the error is in the File.
Second, is Parallel Transporter the only way to get JSON files into Teradata?
Thanks,
Philip
Transferring data from one server to another
Hi,
Currently, I am trying to transfer data from a table on Teradata (TD) server 1 to TD server 2.
Is there any way I can copy this data directly from server 1 to server 2? Right now I'm using FastExport to write a file to my hard drive and then loading the CSV file into server 2. Is there a better way to do this? My export is taking forever, and I would like to speed up the process if possible.
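One approach often suggested for this, sketched here with placeholder system names, credentials, and columns: a single TPT job whose Export operator reads from server 1 and feeds the Load operator on server 2 directly, so nothing lands on disk.

```
DEFINE JOB server_to_server_copy
DESCRIPTION 'Sketch: export from server 1 directly into server 2'
(
  DEFINE SCHEMA source_schema
  (
    col1 INTEGER,
    col2 VARCHAR(50)
  );

  DEFINE OPERATOR export_src
  TYPE EXPORT
  SCHEMA source_schema
  ATTRIBUTES
  (
    VARCHAR TdpId        = 'server1',   /* placeholder */
    VARCHAR UserName     = 'user1',
    VARCHAR UserPassword = 'pwd1',
    VARCHAR SelectStmt   = 'SELECT col1, col2 FROM SrcDB.SrcTable;'
  );

  DEFINE OPERATOR load_tgt
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR TdpId        = 'server2',   /* placeholder */
    VARCHAR UserName     = 'user2',
    VARCHAR UserPassword = 'pwd2',
    VARCHAR TargetTable  = 'TgtDB.TgtTable',
    VARCHAR LogTable     = 'TgtDB.TgtTable_log',
    VARCHAR ErrorTable1  = 'TgtDB.TgtTable_e1',
    VARCHAR ErrorTable2  = 'TgtDB.TgtTable_e2'
  );

  APPLY
    ('INSERT INTO TgtDB.TgtTable (col1, col2) VALUES (:col1, :col2);')
  TO OPERATOR (load_tgt)
  SELECT * FROM OPERATOR (export_src);
);
```

Run with `tbuild -f copy_job.tpt`. Whether this beats FastExport-plus-load depends on where the time is going: if the export SELECT itself is the bottleneck, the same bottleneck will remain.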
Thanks,
How to extract the count of ODBC source records in TPT
I am trying to provide some assertions to users based on the number of records read from the source and the number of records loaded into TD.
Are there any diagnostic functions in TPT to extract the count of source records processed and the number of records loaded into TD?
I have some details in the log file, like those below.
LOYALTY_REDEMPTION_INFO_odbc: sending SELECT request
LOYALTY_REDEMPTION_INFO_odbc: data retrieval complete
LOYALTY_REDEMPTION_INFO_odbc: Total Rows Exported: 576369
LOYALTY_REDEMPTION_INFO_updt: entering Application Phase
LOYALTY_REDEMPTION_INFO_updt: Statistics for Target Table: 'LOYALTY_REDEMPTION_STG'
LOYALTY_REDEMPTION_INFO_updt: Rows Inserted: 576369
Can anyone suggest how I can get these highlighted values without parsing the logs?
TeraGSS Security Library [115022] Exception occured in TERAGSS layer. See inner exception
Getting this exception whenever I try to open the connection.
Teradata.Client.Provider.TdException (0x80004005): [TeraGSS Security Library] [115022] Exception occured in TERAGSS layer. See inner exception for details. ---> System.NullReferenceException: Object reference not set to an instance of an object.
   at Teradata.Client.Provider.WpSession.CreateSecureBuffer(Buffer buffer, UInt32 length, Boolean privacy)
   at Teradata.Client.Provider.WpSession.Send(WpMessage message, Buffer buffer, Boolean encryption, Boolean integrity, Boolean unicodecharset, Boolean allowSessionStateChange)
   at Teradata.Client.Provider.WpSession.Send(WpMessage message, Buffer buffer, Boolean unicodecharset)
   at Teradata.Client.Provider.WpMessageManager.Send(WpMessage message)
   at Teradata.Client.Provider.WpConfigManager.Action()
   at Teradata.Client.Provider.WpSession.InvokeConfigManager(Int32 connectionTimeout, UtlStopwatchWrapper watch, Buffer buffer, ParcelFactory pclFactory)
   at Teradata.Client.Provider.WpSession.InvokeSessionInitializerManagers(Boolean reconSession, Int32 connectionTimeout, String password, UtlStopwatchWrapper watch, ParcelFactory& pclFactory, Buffer& buffer, String& logonString, String& dbsVersion, WpConfigManager& configManager)
   at Teradata.Client.Provider.WpSession.InvokeOpenManager(Int32 connectionTimeout, String password, UtlStopwatchWrapper watch)
   at Teradata.Client.Provider.WpSession.Open(Int32 connectionTimeout, String password)
   at Teradata.Client.Provider.ExeContext`3.Open(Int32 timeout, String password)
   at Teradata.Client.Provider.Connection.Open(UtlConnectionString connectionString, UInt32 timeout)
   at Teradata.Client.Provider.ConnectionFactory.GetConnection(Object owningObject, UtlConnectionString connStr)
   at Teradata.Client.Provider.TdConnection.Open()
   at JDWSecurity.ConnectToTeradata()
Teradata SQL Assistant 14 - History Window keeps moving
Is there a way to lock the history window in SQL Assistant to the bottom-right corner of the application, just like in older versions of Queryman?
Teradata ODBC Driver For Linux - Urgent Help Requested
Hi Team,
I am facing an issue after installing the TD 14.10 ODBC driver.
Installation sequence:
rpm -ivh tdicu/tdicu-14.10.00.04-1.noarch.rpm
rpm -ivh TeraGSS/TeraGSS_linux_x64-14.10.07.01-1.noarch.rpm
rpm -ivh tdodbc/tdodbc-14.10.00.09-1.noarch.rpm
My .profile:
#For TD Driver
export TDROOT=/opt/teradata/client
export TDHOME=$TDROOT/14.10
export TPTHOME=$TDHOME/tdicu
export NLSPATH=$NLSPATH:$TDHOME/odbc_64/msg/%N
export TD_ICU_DATA=$TDHOME/tdicu/lib64
#ODBC Connection
export ODBCHOME=$TDROOT/odbc
export ODBCINI=/opt/teradata/client/14.10/odbc_64/odbc.ini
export ODBCINST=/opt/teradata/client/14.10/odbc_64/odbcinst.ini
#Other Imp Paths Updated For TD
PATH=$PATH:/opt/teradata/client/14.10/odbc_64/lib:/opt/teradata/client/14.10/odbc_64/bin
export PATH=$PATH:${TDHOME}/odbc_64/samples/C
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$ODBCHOME/lib:$TDHOME/lib64:/usr/lib:$TDHOME/tdicu/lib64
My odbc.ini file:
[TDODBC]
Driver=/opt/teradata/client/14.10/odbc_64/lib/tdata.so
Description=Teradata database
DBCName=10.212.200.133
LastUser=
Username=abc
Password=abc@123
Database=
DefaultDatabase=
Testing:
[root@ftvlsldcbmacdc01 C]# ./adhoc
Enter Data Source Name: TDODBC
Enter UserID: abc
Enter Password:
Connecting with SQLConnect(DSN=TDODBC,UID=DBU_ETL,PWD=*)...
adhoc: (SQL Diagnostics) STATE=28000, CODE=0, MSG=[Teradata][ODBC Teradata Driver] Not enough information to log on
ODBC connection closed.
Can you please help me work out how to resolve this "Not enough information to log on" issue? It has become somewhat urgent; I have already lost several days trying to find the cause.
Thanks & Regards,
Shubhendu
Error code 3798
Hi All,
I'm getting error code 3798 while running a FastExport script. The script selects around 400 columns; I use 25 columns as a group, concatenate it with the next group of 25 columns, and so on, and as a result I get the 3798 error. I tried dividing the query into two, but that is very time-consuming, and I would have to rename all the columns in order to join on the PI. I also need Col1-Col200 from the first derived table and Col2-Col200 from the next table. Please find below a snippet of the query.
SELECT DISTINCT
('¿'||'|'||
COALESCE(TRIM(COL1),'')||'|'||COALESCE(TRIM(COL2),'')||'|'||COALESCE(TRIM(COL3),
'')||'|'||COALESCE(TRIM(COL4),'')||'|'||COALESCE(TRIM(COL5),
'')||'|'||COALESCE(TRIM(COL6),'')||'|'||COALESCE(TRIM(COL7),
AND SO ON,
'')||'|'||'¿'
) (CHAR(63604))
FROM TBL;
Receiving an error message while trying to use FastLoad
Hi, I receive error message 8017 ("The UserId, Password or account is invalid") while trying to load data using FastLoad. All the files are saved on the Desktop.
********** Fast Load Script**************
logon 10.61.59.93/796207,Wpwp123;
drop table DATAMDL_SNDBX.QA_FL_PD;
drop table DATAMDL_SNDBX.ERROR_TABLE_ucv;
drop table DATAMDL_SNDBX.ERROR_TABLE_TV;
CREATE SET TABLE DATAMDL_SNDBX.QA_FL_PD ,NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
NAME VARCHAR(10) CHARACTER SET LATIN NOT CASESPECIFIC,
INITIAL VARCHAR(10) CHARACTER SET LATIN NOT CASESPECIFIC)
PRIMARY INDEX CRO_FLIGHT_LEG_DEP_NUPI ( NAME );
SET RECORD VARTEXT'~';
DEFINE
NAME (VARCHAR(10)),
INITIAL (VARCHAR(10))
FILE = C:\Users\Scarlet\Desktop\FL_Data.TXT;
BEGIN LOADING DATAMDL_SNDBX.QA_FL_PD ERRORFILES teradata fastload.ERROR_TABLE_UCV, teradata fastload.ERROR_TABLE_TV;
INSERT INTO DATAMDL_SNDBX.QA_FL_PD
VALUES (:NAME,
:INITIAL);
END LOADING;
LOGOFF;
*******End of fast load script***********
********File containing data (with only 1 record)**********
NAME~INITIAL
PRASHANT~PD
**********Error Message****************
C:\Windows\system32>cd\
C:\>fastload<C:\Users\Scarlet\Desktop\FL_Script.TXT
===================================================================
= =
= FASTLOAD UTILITY VERSION 14.10.00.03 =
= PLATFORM WIN32 =
= =
===================================================================
===================================================================
= =
= Copyright 1984-2013, Teradata Corporation. =
= ALL RIGHTS RESERVED. =
= =
===================================================================
**** 14:08:03 Processing starting at: Thu Mar 10 14:08:02 2016
===================================================================
= =
= Logon/Connection =
= =
===================================================================
0001 logon 10.61.59.93/796207,
**** 14:08:03 RDBMS error 8017: The UserId, Password or Account is
invalid.
**** 14:08:03 Unable to log on Main SQL Session
**** 14:08:03 FastLoad cannot continue. Exiting.
===================================================================
= =
= Exiting =
= =
===================================================================
**** 14:08:03 Total processor time used = '0.124801 Seconds'
. Start : Thu Mar 10 14:08:02 2016
. End : Thu Mar 10 14:08:03 2016
. Highest return code encountered = '12'.
**** 14:08:03 FDL4818 FastLoad Terminated
C:\>
Teradata Metadata Manager
Have Teradata Metadata Services been discontinued since TD 14.10?
Where can I download them, if they are still available?
TPT: multiple sources in an operator?
Hello everyone,
I have tried to find out whether it is possible to have two sources in the SelectStmt of an UPDATE operator (in my case, a SELECT against Oracle with a WHERE against Teradata), or to nest two operators.
I have found nothing, so I guess it's impossible, but I preferred to ask before giving up.
I have two other questions: what is the maximum size of a job variable, and is it possible to put the result of a query into a job variable?
Thanks,
py
TPT 15.10: the need for TWB_ROOT?
Hello,
We are getting errors when running TPT 15.10 on Solaris 10. The output is:
Teradata Parallel Transporter Version 15.10.00.04 64-Bit
TPT_INFRA: TPT02002: Error: TWB_ROOT environment variable is not defined.
However, the TPT 14.10 User Guide says: "Replaced TWB_ROOT with the following variable: TPT_install_directory."
The TTU 15.10 Install Guide for Solaris doesn't mention it either, referring instead to TPT_install_directory.
I believe we have installed correctly, according to the install guide; tbuild is in the path and appears to start (until the error).
The 15.10 messages reference suggests that the variable is still required (TPT02002).
So, is TWB_ROOT required as an environment variable or not? Is an incident warranted (for the documentation or TTU)?
Thanks
SSIS Error: The Teradata TPT registry key cannot be opened.
I have a SQL Server 2012 database running on a Windows Server 2012 platform. I have installed the Microsoft Connector for Attunity version 2.0, and in SSIS I now have Teradata Source and Destination tools available. I have created a connection to the database and table I wish to use in my SSIS package. The connection works, and it allows me to preview the data just fine.
But, when I run the package, it fails with:
The TeraData TPT registry key cannot be opened. Verify that the TPT API 12.0 or 13.0 edition 2 (13.0.0.2) for Windows x86 is installed properly.
I know this is installed, and I can see the registry key. Can anyone help me determine why this is happening and how I can fix it? Is there something else I need to load?
Usage of multiple sessions by Teradata utilities like FL/ML/FE/TPT jobs?
Hi All
Can someone explain how multiple sessions are used by a FastLoad/MultiLoad/TPT/FastExport job?
I understand that the maximum number of sessions for a job should not be greater than the total number of AMPs in the system.
For example:
Say a particular FastExport job is using 10 sessions, and 100 rows are expected to be exported.
How is the workload then shared between the multiple sessions?
Does each session export 10 rows?
If one session does all the export processing, why are the remaining sessions required?
Thanks
Sanket
tdload failing with TPT02638: Error: Conflicting data length
We are trying to use tdload to export data from one system and load it into another in the same job. It throws an error like "Conflicting data length for column: source column length is (16), target column length (8)". The data type of this column is DECIMAL(30,0) on both the source and target systems. This used to work while we were on TPT version 14, but a week ago we upgraded to TPT 15.10.1 (our database version is still 14.10.7.1). Is the upgrade causing the issue, and is there any way we can rectify it? I saw in other forums a suggestion to increase MaxDecimalDigits to 38; is this possible with tdload?
Thanks,
Mani
Teradata TPT attributes to export data to a delimited file where each column's value is enclosed in quotes
Hello
Would someone be able to help me with which attributes should be added to a TPT script to extract data with a delimiter, with each column's value enclosed in single quotes?
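Not an authoritative answer, but on recent TTU releases the DataConnector consumer operator has quoting attributes that should cover this. A sketch of the relevant attribute block (the file name, delimiter, and quote choices are examples; check your TPT version's DataConnector documentation for availability):

```
DEFINE OPERATOR file_writer
TYPE DATACONNECTOR CONSUMER
SCHEMA *
ATTRIBUTES
(
  VARCHAR FileName       = 'export_out.txt',   /* placeholder */
  VARCHAR Format         = 'Delimited',
  VARCHAR TextDelimiter  = '|',
  VARCHAR OpenMode       = 'Write',
  /* Wrap every column value in quotes; '''' is an escaped single quote. */
  VARCHAR QuotedData     = 'Yes',
  VARCHAR OpenQuoteMark  = '''',
  VARCHAR CloseQuoteMark = ''''
);
```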
Teradata SQL Assistant
How can I convert the SAS function below to Teradata?
input(substr(scan(dirct_advtsg_src_cd, 5, "-"),1,4),8.) >= 700
dirct_advtsg_src_cd = 123_234_456_523_708000_7896564
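A possible translation, assuming Teradata 14.0+ where STRTOK is available. SAS's scan(x, 5, "-") takes the 5th hyphen-delimited token, substr(..., 1, 4) takes its first four characters, and input(..., 8.) converts that to a number, so a rough Teradata equivalent (table name here is a placeholder) would be:

```sql
-- 5th '-'-delimited token, first 4 characters, compared numerically.
SELECT *
FROM my_table
WHERE CAST(SUBSTR(STRTOK(dirct_advtsg_src_cd, '-', 5), 1, 4) AS INTEGER) >= 700;
```

Note that the sample value shown is underscore-delimited, so the delimiter argument may actually need to be '_' rather than '-'; with '_', the 5th token of the sample is 708000, whose first four characters give 7080 >= 700.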