I downloaded the TTU 15.0 package, opened the zip and the readme, and the product list entries all say 14.10. Is this an oversight, or did I get the wrong package? (Asking before I install.) Thanks.
TTU 15.0 download, yet readme and product list refer to 14.10
TPT User Templates
I want to create and test some user templates, which I'm going to base on the Teradata provided templates $LOAD and $EXPORT.
Do the templates I create have to be saved in the same directory as the ones installed with the product? On Linux that directory is writable only by root, so I would need root access to write to it and test the templates. That would have to be done by a Linux systems administrator, who is unlikely to want to install a template that hasn't been tested.
Ideally I would be able to save the user written templates to a different directory so that if, for example, we reinstall TPT our templates do not get overwritten, and also so we can change the user written templates without needing root permission.
Is this possible?
Is anyone else using user written templates? If so how do you manage testing and implementing them?
TPT scripting: Escaping characters in Attribute fields
Hello guys, I'm new to TPT and I've run into a problem with special characters in attribute values.
Here is a snippet of my TPT code
DEFINE OPERATOR OS_OPERATOR
DESCRIPTION 'OS CMD OPERATOR FOR TPT'
TYPE OS COMMAND
ATTRIBUTES
(
    VARCHAR PrivateLogName = 'OS_CMD_log',
    VARCHAR OsCmd = 'awk -F "[/,]"'NR>1{printf("%s,%s,%s,%02d/%02d/%04d,%02d/%02d/%04d\n",$1,$2,$3,$4,$5,$6,$7,$8,$9)}' in_file > out_file',
    VARCHAR IgnoreError = 'NO'
);
Basically, I'm trying to run this awk command that contains special characters that I assume don't work well with TPT ('" $)
awk -F "[/,]"'NR>1{printf("%s,%s,%s,%02d/%02d/%04d,%02d/%02d/%04d\n",$1,$2,$3,$4,$5,$6,$7,$8,$9)}' in_file > out_file
How should I write my OsCmd attribute in order to escape these special characters ?
A general answer on how to handle attribute values that contain special characters is most welcome :)
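For anyone searching later: the TPT script language follows the usual SQL convention of doubling a single quote to embed it inside a quoted string literal, so each ' inside the awk program should become ''. A sketch of the attribute under that assumption (untested):

```
VARCHAR OsCmd = 'awk -F "[/,]" ''NR>1{printf("%s,%s,%s,%02d/%02d/%04d,%02d/%02d/%04d\n",$1,$2,$3,$4,$5,$6,$7,$8,$9)}'' in_file > out_file',
```

The $ signs and double quotes should not need escaping inside the TPT string; only the embedded single quotes do.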
Custom Message in Bteq
Hi All,
I want a customized error output message for each statement failure in BTEQ. I tried the code below but I get this error:
*** Error: Illegal value "MSG" specified. Notify ignored.
insert into target_table1 sel * from source;
.if errorcode<>0 then .notify msg 'Bteq failed in table1 insert'; .quit errorcode;
insert into target_table2 sel * from source;
.if errorcode<>0 then .notify msg 'Bteq failed in table2 insert'; .quit errorcode;
Is there any other way I can give a custom error message for each failure, to pinpoint exactly which step failed?
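One workaround that avoids NOTIFY entirely (which may not accept a MSG clause in BTEQ) is to branch to a label and print the message with .REMARK; a hedged sketch, with hypothetical label names and illustrative fixed return codes:

```
insert into target_table1 sel * from source;
.if errorcode <> 0 then .goto fail1
insert into target_table2 sel * from source;
.if errorcode <> 0 then .goto fail2
.quit 0

.label fail1
.remark 'Bteq failed in table1 insert'
.quit 8

.label fail2
.remark 'Bteq failed in table2 insert'
.quit 8
```

Fixed codes are used at the .QUIT because ERRORCODE reflects the most recent statement and may no longer hold the failing statement's code by the time the label runs; adjust as needed.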
Thanks,
Ambuj
TPT Extract not working for column name starting with FROM_
I got the below mentioned error when I had column name starting with FROM_ in the table.
TPT_INFRA: TPT02932: Error: TPT_INFRA: TPT02934: Error: invalid token
near line 193 (text was '_')
line 193: syntax error at "FROM" missing { REGULAR_IDENTIFIER_ EXTENDED_IDENTIFIER_ EXTENDED_IDENTIFIER_NO_N_ } in Rule: Regular Identifier
TPT_INFRA: TPT03022: Error: Syntax error occurred in parse rule Rule: Column Definition
TPT_INFRA: TPT03050: Error: Semantic error at or near job script line 233:
Schema 'SCHEMA_MBL_CARD_EVT_Config_dat_extract' is undefined.
Schema cannot be resolved.
TPT_INFRA: TPT03050: Error: Semantic error at or near job script line 474:
Schema 'SCHEMA_MBL_CARD_EVT_Config_dat_extract' is undefined.
Schema cannot be resolved.
Compilation failed due to errors. Execution Plan was not generated.
Job script compilation failed.
When I removed that column from the script, it worked fine.
Is there anything I need to take care of in order to make it work?
I know that I can use an alias in the schema and in the export operator's SELECT; that is the workaround I found, but shouldn't TPT work for any column name?
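To make the workaround concrete, the aliasing might look like this (column, schema, and table names here are hypothetical; the schema then declares only the alias, so no identifier starts with FROM_):

```
DEFINE SCHEMA Config_dat_Schema
(
    FROMDT_ALIAS VARCHAR(10)   /* alias avoids an identifier starting with FROM_ */
);
...
VARCHAR SelectStmt = 'SELECT FROM_DT AS FROMDT_ALIAS FROM MyDb.MyTable;'
```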
Which Teradata process is using this table?
Issues in querying Informix Database using Teradata SQL assistant via ODBC
I am trying to query an Informix database from Teradata SQL Assistant via an ODBC data source created with IBM Informix ODBC driver version 4.10.00.15364.
I am getting all blanks in the output where the field type is varchar or char in the source, even though the source database has data in those columns. Even when I export the data to a flat file, the varchar/char columns come out blank. The Teradata SQL Assistant version is 14.10.0.02.
Is there any resolution available?
Visual Studio Extension (Does not show * for unsaved files)
Hello,
In Visual Studio, the typical way a developer knows a file has unsaved changes is the "*" in the tab. The extension created by Teradata clears the "*" when an execute is performed. It would be nice if Teradata followed the Visual Studio convention in this regard and left the "*" in the tab until the file is actually saved.
Kind regards
OMD API error
We recently upgraded our TTU client on a Windows Data Services server from 14.00 to 14.10. The TPT programs are patched to 14.10.00.05.
The server is Windows Server 2008 Enterprise SP1 64-bit
The first TPTLoad tests produced the following warning:
TPT_INFRA: TPT04190: Warning: OMD API failed to initialize
The entire out file consists of:
C:\Program Files (x86)\Teradata\Client\14.10\Teradata Parallel Transporter\logs>
tlogview -l STAGE_CORP_OAS_T_OAS_MATL_BUS_IMPACT_S1-2.out
TPT_INFRA: TPT04101: Warning: Teradata PT cannot connect to Unity EcoSystem Manager.
The job will continue without event messages being sent to Unity EcoSystem Manager.
TPT_INFRA: TPT04190: Warning: OMD API failed to initialize
Teradata Parallel Transporter Coordinator Version 14.10.00.05
Teradata Parallel Transporter Executor Version 14.10.00.05
Teradata Parallel Transporter Executor Version 14.10.00.05
Teradata Parallel Transporter Load Operator Version 14.10.00.05
load_opt: private log specified: pl_OAS_MATL_BUS_IMPACT_S1
Teradata Parallel Transporter file_opt: TPT19006 Version 14.10.00.05
file_opt: TPT19206 Attribute 'TraceLevel' value reset to 'Statistics Only'.
file_opt Instance 1 directing private log report to 'dtacop-boetlqas-2708-1'.
file_opt: TPT19003 NotifyMethod: 'None (default)'
file_opt: TPT19008 DataConnector Producer operator Instances: 1
file_opt: TPT19003 ECI operator ID: 'file_opt-2708'
file_opt: TPT19222 Operator instance 1 processing file '\\.\pipe\DS_TD_TGT_STAGE_T_OAS_MATL_BUS_IMPACT_S1_0.dat'.
load_opt: connecting sessions
load_opt: The job will use its internal retryable error codes
load_opt: preparing target table
load_opt: entering Acquisition Phase
Task(SELECT_2[0001]): checkpoint completed, status = Success
Task(APPLY_1[0001]): checkpoint completed, status = Success
Task(SELECT_2[0001]) ready to checkpoint
Task(SELECT_2[0001]): checkpoint completed, status = Success
Task(APPLY_1[0001]): checkpoint completed, status = Success
Task(SELECT_2[0001]) ready to take the EOD checkpoint
Task(SELECT_2[0001]): checkpoint completed, status = Success
Task(APPLY_1[0001]): checkpoint completed, status = Success
load_opt: entering Application Phase
load_opt: Statistics for Target Table: '"STAGE_CORP_OAS_T"."OAS_MATL_BUS_IMPACT_S1"'
load_opt: Total Rows Sent To RDBMS: 5
load_opt: Total Rows Applied: 5
load_opt: Total Rows in Error Table 1: 0
load_opt: Total Rows in Error Table 2: 0
load_opt: Total Duplicate Rows: 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 1, Total Rows Received = 0, Total Rows Sent = 5
file_opt: TPT19221 Total files processed: 1.
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 1, Total Rows Received = 5, Total Rows Sent = 0
load_opt: disconnecting sessions
load_opt: Total processor time used = '0.703125 Second(s)'
load_opt: Start : Fri Sep 05 15:20:23 2014
load_opt: End : Fri Sep 05 15:20:41 2014
Job step MAIN_STEP completed successfully
Job STAGE_CORP_OAS_T_OAS_MATL_BUS_IMPACT_S1 completed successfully
Job start: Fri Sep 05 15:20:19 2014
Job end: Fri Sep 05 15:20:41 2014
Total available memory: 20000000
Largest allocable area: 20000000
Memory use high water mark: 270264
Free map size: 1024
Free map use high water mark: 20
Free list use high water mark: 0
Everything appears to have loaded correctly. Do we need to be concerned with this warning message?
Thanks,
Joe
10061 WSA E ConnRefused: The Teradata server is not accepting connections
Hi,
I downloaded td-ttu-15.00_for_Windows on my personal computer and created a DSN using my public IP (over the internet) as the server IP for the DSN.
When connecting to the DSN in TDA, I get "10061 WSA E ConnRefused: The Teradata server is not accepting connections".
Thanks,
Nagendra
Installing TD 15.0
I am running into a problem installing TD Studio 15: the second step asks for my JRE directory. I know that I have Java 1.7, but I cannot find the right directory to move the install forward. Please advise.
How to find the number of columns in each table in Teradata?
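One common approach is the DBC.ColumnsV system view, which carries one row per column (on older releases the non-V DBC.Columns view should work the same way); a sketch:

```
SELECT DatabaseName, TableName, COUNT(*) AS Column_Count
FROM DBC.ColumnsV
GROUP BY DatabaseName, TableName
ORDER BY DatabaseName, TableName;
```

Add a WHERE DatabaseName = '...' predicate to restrict it to one database.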
Easy Loader Target Working Database
Hello:
I posted a few weeks ago here just for some background.
http://forums.teradata.com/forum/tools/easy-loader-issue
I'm finding I can't use Easy Loader to load a target table unless that table is created in my default logon database and is qualified on the command line with my default database.
If I qualify my target table (-t) with another database name, or if I qualify the target table (-t) with another database name and also set the working database to that database (--TargetWorkingDatabase), I get a "2583: Worktable is missing" error.
The only way I've been able to load a table from another Teradata table using Easy Loader is by explicitly setting the database qualifier to my default database (-t switch) and having the table created in that database. My load happens to take a table from one Teradata system and move it to another Teradata system (the tables are not on the same system).
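For comparison, the system-to-system invocation I would expect to work looks roughly like this (host, user, password, table, and job names are all placeholders, and the option spellings are from memory, so please check them against the tdload documentation):

```
tdload --SourceTdpId src_system --SourceUserName src_user --SourceUserPassword src_pw \
       --SourceTable SrcDb.My_Table \
       -h tgt_system -u tgt_user -p tgt_pw \
       -t TgtDb.My_Table --TargetWorkingDatabase TgtDb \
       my_job_name
```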
Thanks for listening again!
Teradata FastLoad error table 1 creates unwanted newlines in DataParcel data.
I have a FastLoad job that loads data from a txt file into a Teradata table. I am trying to get the data from error table 1 into a structured format in a separate file. Currently, the data generated from each DataParcel for each row inserts a newline character after every DATE-type value.
For eg: If my source file has following data
200239|Wk39|200209|October|20023|Quarter3|2002|2002|2002-10-27|2002-11-02|2002-10-06|2002-11-02|2002-08-04|2002-11-02||2003-02-01|||||||
The export data from error table generates it into following format.
200239|Wk39|200209|October|20023|Quarter3|2002|2002|2002-10-27
|2002-11-02
|2002-10-06
|2002-11-02
|2002-08-04
|2002-11-02
||
2003-02-01|||||||
I added the delimiters back manually using find/replace sed commands.
My question is: how can I obtain the data from the FastLoad DataParcel exactly as it was in the original source file? Why are these extra newlines being added to the data?
Query not working in Teradata SQL Assistant with Teradata .Net while the same query is working with ODBC connection & vice versa
I am facing an issue with a query whose WHERE clause is defined as WHERE sample_dt <> '01/01/2010' (that is, with format 'DD/MM/YYYY'). When run in TD SQL Assistant with the ODBC connection provider, this query ran perfectly, but the same query with the Teradata .Net provider failed with the error below:
SELECT Failed. 3535: A character string failed conversion to a numeric value.
Hence I changed the format of the DATE value to '2000-01-01', and the query ran successfully for some users with both connectivities (ODBC and .Net).
But on further runs by other users I noticed that this same modified query was NOT working with the ODBC connection, yet it worked with the Teradata .Net connection. My question is: given the same environment for the users (tool and connectivity provider versions below), why do we see this difference in query execution?
System type: 32-bit operating system.
Database version: Teradata 14.10.01.07
TD Sql Assistant version: 13.11.0.04
ODBC Provider Version : ODBC 13.10.00.06
Teradata .Net Provider Version : Teradata .Net 13.11.0.1
Further, when I changed the DATE format in the underlying table definition from
sample_dt DATE FORMAT 'DD/MM/YYYY' to sample_dt DATE FORMAT 'YYYY-MM-DD', the query ran perfectly for all users.
But I want to know why, without the change of format in the table definition, we see these errors for users in the same environment as stated above. Please explain.
I assume that ANY Teradata query SHOULD work with both the ODBC and .Net providers. Does this change mean there are features that are not supported with Teradata .Net connectivity or with ODBC connectivity? If so, can someone tell me what other features will not work, and what care should be taken when writing queries to ensure they work with both connections?
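For what it's worth, an ANSI date literal sidesteps the session and provider date-format question entirely, since it is not parsed against a FORMAT at all; a sketch against a hypothetical table:

```
SELECT *
FROM MyDb.MyTable
WHERE sample_dt <> DATE '2010-01-01';  /* ANSI literal: independent of session format */
```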
BTEQ Data Truncation while Exporting
Hello,
Seems like the following is a common issue: truncation while exporting using BTEQ. I searched on Google but could not find a good answer, so I am posting here. I am creating a '|'-delimited file using BTEQ, encapsulating all the selected fields with "" in the output, in a UNIX environment. But not all the fields are getting exported; the output is truncated after column 75. Please help.
Here is the below script that I am using.
.run file /u/users/analytics/.logons.txt;
.EXPORT report file = /u/analytics/data/item.dat;
.set recordmode off;
.set width 100000;
SELECT
top 10
'"' || trim(coalesce(cast(item_nbr AS VARCHAR(20)),'?'))||'"|"'||
trim(coalesce(cast(Old_nbr AS VARCHAR(20)),'?'))||'"|"'||
trim(coalesce(cast(Status AS VARCHAR(5)),'?'))||'"|"'||
trim(coalesce(cast(upc AS VARCHAR(30)),'?'))||'"|"'||
trim(coalesce(cast(cat_nbr AS VARCHAR(10)),'?'))||'"|"'||
trim(coalesce(cast(sub_cat_nbr AS VARCHAR(10)),'?'))||'"|"'||
trim(coalesce(cast(description AS VARCHAR (80)),'?'))||'"|"'||
trim(coalesce(cast(brand AS VARCHAR (80)),'?'))||'"|"'||
trim(coalesce(cast(pb_flag AS VARCHAR(20)),'?'))||'"|"'||
trim(coalesce(cast(manufacturer_name AS VARCHAR (80)),'?'))||'"|"'||
trim(coalesce(cast(distributor_name AS VARCHAR (80)),'?'))||'"|"'||
trim(coalesce(cast(pack_nbr AS VARCHAR(30)),'?'))||'"|"'||
trim(coalesce(cast(size_nbr AS VARCHAR(30)),'?'))||'"|"'||
trim(coalesce(cast(size_desc AS VARCHAR (6)),'?'))||'"|"'||
trim(coalesce(cast(likeitemgroup AS VARCHAR (100)),'?'))||'"|"'||
trim(coalesce(cast(linecode AS VARCHAR(20)),'?'))||'"|"'||
trim(coalesce(cast(Corelist AS VARCHAR(20)),'?'))||'"|"'||
trim(coalesce(cast(PricingReportExceptionItem AS VARCHAR(20)),'?'))||'"|"'||
trim(coalesce(cast(Automarkup AS VARCHAR(20)),'?'))||'"|"'||
trim(coalesce(cast(PricingSKU AS VARCHAR(15)),'?')) || '"'
from wm_ad_hoc.dim_itemattributes;
.EXPORT reset
.logoff;
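One thing to check in the script above: BTEQ's default report width is 75 characters, and WIDTH only accepts values up to a cap (65531 in recent releases, if I remember right). If 100000 exceeds the cap, that SET may be rejected, leaving the 75-character default in effect, which would match the truncation at column 75. A sketch:

```
.set width 65531;
```

Check BTEQ's output for an error on the .set width line to confirm whether this is what happened.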
Thanks,
stat
TPT Export Error TPT01403
Hello,
I am trying to export data from a Teradata table wm_ad_hoc.Table1 using TPT to a delimited file in UNIX. The table has around 11 million rows, and the export works fine using BTEQ.
But when I used TPT, only 8 million rows were exported before it errored out with the message:
"TPT_INFRA: TPT01403: Error: Unable to write data to the file (fileHandle: 3), System errno: 9 (Bad file descriptor)".
Please help me resolve this error.
Thanks,
stat
Downloads from downloads.teradata.com are blocked by Norton
I've tried to download TTU 15 Installer from https://downloads.teradata.com/download/tools/teradata-tools-and-utilities-windows-installation-package
which redirects to d289lrf5tw1zls.cloudfront.net, which is blocked by Norton:
http://safeweb.norton.com/report/show?url=d289lrf5tw1zls.cloudfront.net
DBSCONTROL does not display data
Hi, a few days ago I opened dbscontrol to view a flag with "GENERAL DISPLAY" and it showed me the information correctly, but now when I try it, it does not show anything; the command keeps processing but never finishes.
The other commands still work (MODIFY, QUIT, HELP); only the DISPLAY command does not.
I checked the running processes with "ps aux | grep dbscontrol" and there are none.
Thanks in advance.
Facing TPT error while loading data from flat file to Teradata
Hi,
Below is my TPT script to load data from flat file to Teradata 15.00.
DEFINE JOB LOAD_TPT_TEST2
DESCRIPTION 'LOAD A TERADATA TABLE FROM A FLAT FILE'
(
DEFINE SCHEMA FILESCHEMA
(
EID INTEGER,
EMPNM VARCHAR(10)
);
DEFINE OPERATOR DATACONN_C2
TYPE DATACONNECTOR PRODUCER
SCHEMA FILESCHEMA
ATTRIBUTES
(
VARCHAR PrivateLogName = 'STG_DB.TPT_TEST.log',
VARCHAR DirectoryPath ='/home/richa/',
VARCHAR FileName = 'testtpt.txt',
VARCHAR Format = 'Delimited',
VARCHAR TextDelimiter = ',',
VARCHAR OpenMode = 'read',
);
DEFINE OPERATOR INSERT_TPT_TEST2
TYPE INSERTER
SCHEMA *
ATTRIBUTES
(
VARCHAR PrivateLogName = 'STG_DB.TEST_Space.log',
VARCHAR TdpId = '127.0.0.1',
VARCHAR UserName = 'dbc',
VARCHAR UserPassword = 'dbc',
VARCHAR TargetTable = 'RETAIL.TARGET_EMP_TABLE',
VARCHAR LogTable = 'retail.STG_DB.TPT_TEST_L',
VARCHAR ErrorTable1 = 'retail.STG_DB.TPT_TEST_E1',
VARCHAR ErrorTable2 = 'retail.STG_DB.TPT_TEST_E2',
VARCHAR WorkTable = 'retail.STG_DB.TPT_TEST_WT'
);
APPLY
('INSERT INTO RETAIL.TARGET_EMP_TABLE(:EID,:EMPNM);')
TO OPERATOR (INSERT_TPT_TEST2[])
SELECT
EID,
EMPNM
FROM OPERATOR
(DATACONN_C2[]);
);
This is the error I am facing:
Teradata Parallel Transporter Version 15.00.00.00
TPT_INFRA: Syntax error at or near line 21 of Job Script File 'tptscrff.tp':
TPT_INFRA: At ")" missing { ARRAY_ BIGINT_ BYTEINT_ CHARACTER_ CHAR_ CHARACTERS_ CHARS_ INT_ INTEGER_ LONG_ SMALLINT_ VARCHAR_ VARDATE_ REGULAR_IDENTIFIER_ EXTE NDED_IDENTIFIER_ EXTENDED_IDENTIFIER_NO_N_ } in Rule: Attribute Definition
Compilation failed due to errors. Execution Plan was not generated.
Job script compilation failed.
Job terminated with status 8
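The parser error ("At \")\" missing { ... } in Rule: Attribute Definition") points at the ')' that follows the trailing comma after the last attribute of DATACONN_C2. A sketch of that ATTRIBUTES block with the comma removed (assuming that is the issue at script line 21):

```
ATTRIBUTES
(
    VARCHAR PrivateLogName = 'STG_DB.TPT_TEST.log',
    VARCHAR DirectoryPath  = '/home/richa/',
    VARCHAR FileName       = 'testtpt.txt',
    VARCHAR Format         = 'Delimited',
    VARCHAR TextDelimiter  = ',',
    VARCHAR OpenMode       = 'read'   /* no comma before the closing parenthesis */
);
```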