Hi
I am getting an error while trying to run a FastLoad script. The error says "FDL4822 DEFINE statement rejected".
Below is the script:
.logon abcd/passwd;
.set record vartext "";
create table abc
(
eno. interger,
ename varchar(20),
dno integer,
sal decimal(10,2)
)
unique primary index(eno);
.DEFINE
eno (varchar(20)),
ename (varchar(20)),
dno (varchar(20)),
sal (varchar(20));
.BEGIN LOADING abc
.ERRORFILES
emp_err1,emp_err2
.CHECKPOINT 10000
File = file path
show;
insert into abc
(
:eno,
:ename,
:dno,
:sal
);
.END LOADING;
.logoff;
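For comparison, here is a cleaned-up version of the same script. The exact root cause may differ, but FDL4822 is usually triggered by syntax problems in or around the DEFINE, and this script has several: a stray period and a misspelled INTEGER in the CREATE TABLE (`eno. interger`), an empty delimiter in SET RECORD VARTEXT, a FILE= clause separated from its DEFINE statement, and an INSERT missing the VALUES keyword. The "," delimiter and `filepath` below are placeholders for your actual values:

```sql
.LOGON abcd/passwd;

/* delimiter must match the input file; "," is a placeholder */
SET RECORD VARTEXT ",";

CREATE TABLE abc
(
  eno   INTEGER,
  ename VARCHAR(20),
  dno   INTEGER,
  sal   DECIMAL(10,2)
)
UNIQUE PRIMARY INDEX (eno);

/* FILE= is part of the DEFINE statement, not of BEGIN LOADING */
DEFINE
  eno   (VARCHAR(20)),
  ename (VARCHAR(20)),
  dno   (VARCHAR(20)),
  sal   (VARCHAR(20))
FILE = filepath;

SHOW;

BEGIN LOADING abc
  ERRORFILES emp_err1, emp_err2
  CHECKPOINT 10000;

/* FastLoad INSERT needs the VALUES keyword */
INSERT INTO abc
VALUES (:eno, :ename, :dno, :sal);

END LOADING;
.LOGOFF;
```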
TPT - Scripts from 13.10 don't work with 15.00 using named pipes
Hi,
I was using TTU 13.10 until now, when I upgraded my production machine to TTU 15.00.
But now I'm getting this error:
TPT19435 pmRead failed. EOF encountered before end of record (35)
The process that loads data into the named pipes doesn't seem to have time to connect to the pipes before TPT starts reading.
When I uninstalled 15.00 and installed 13.10 again, everything went back to normal.
I also tried running tbuild 15.00 with the np_AXSMOD.dll from the 13.10 package, but the result was the same.
I can't find anything wrong in the docs.
Here's the dataconnector producer:
DEFINE OPERATOR PIPE_READER1()
DESCRIPTION 'Define opcoes de leitura de arquivos'
TYPE DATACONNECTOR PRODUCER
SCHEMA LOAD_AS_IS_TEST_SCHEMA
ATTRIBUTES
(
    VARCHAR AccessModuleName = 'np_axsmod.dll',
    VARCHAR AccessModuleInitStr,
    VARCHAR FileName = '\\.\pipe\LoadAsIs_1',
    VARCHAR Format = 'TEXT',
    VARCHAR IndicatorMode = 'N',
    VARCHAR OpenMode = 'Read',
    VARCHAR RowErrFileName = @errorFilePath,
    INTEGER MaxSessions = @maxExportSessions,
    INTEGER MinSessions = @minExportSessions
);
Log (using 13.10):
15:24:07 INFO > Teradata Parallel Transporter Version 13.10.00.02
15:24:07 INFO > Job log: D:\Programacao\Java\Projetos\teradata-loader\src\test\resources\temp/LoadAsIs-1.out
15:24:07 ERROR> WARN:Failed to lookup account administrators
15:24:07 ERROR> WARN:Failed to lookup account administrators
15:24:07 INFO > Job id is LoadAsIs-1, running on DragonZord
15:24:10 INFO > Teradata Parallel Transporter SQL DDL Operator Version 13.10.00.02
15:24:10 INFO > PREPARES_LOAD: private log not specified
15:24:13 INFO > PREPARES_LOAD: connecting sessions
15:24:14 INFO > PREPARES_LOAD: sending SQL requests
15:24:16 INFO > PREPARES_LOAD: TPT10508: RDBMS error 3807: Object 'DCT_PROD_TDM.LOAD_AS_IS_TEST' does not exist.
15:24:16 INFO > PREPARES_LOAD: TPT18046: Warning: error is ignored as requested in ErrorList
15:24:16 INFO > PREPARES_LOAD: disconnecting sessions
15:24:17 INFO > PREPARES_LOAD: Total processor time used = '0.140401 Second(s)'
15:24:17 INFO > PREPARES_LOAD: Start : Mon Aug 11 15:24:10 2014
15:24:17 INFO > PREPARES_LOAD: End : Mon Aug 11 15:24:17 2014
15:24:17 INFO > Job step PREPARES_LOAD_LOAD_AS_IS_TEST completed successfully
15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02
15:24:26 INFO > PIPE_READER4: TPT19008 DataConnector Producer operator Instances: 1
15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02
15:24:26 INFO > Teradata Parallel Transporter Load Operator Version 13.10.00.02
15:24:26 INFO > DATA_LOAD: private log not specified
15:24:26 INFO > PIPE_READER5: TPT19008 DataConnector Producer operator Instances: 1
15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02
15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02
15:24:26 INFO > PIPE_READER3: TPT19008 DataConnector Producer operator Instances: 1
15:24:26 INFO > PIPE_READER2: TPT19008 DataConnector Producer operator Instances: 1
15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02
15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02
15:24:26 INFO > PIPE_READER8: TPT19008 DataConnector Producer operator Instances: 1
15:24:26 INFO > PIPE_READER6: TPT19008 DataConnector Producer operator Instances: 1
15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02
15:24:26 INFO > PIPE_READER7: TPT19008 DataConnector Producer operator Instances: 1
15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02
15:24:26 INFO > PIPE_READER1: TPT19008 DataConnector Producer operator Instances: 1
15:24:26 INFO > PIPE_READER1: TPT19003 ECI operator ID: PIPE_READER1-6656
15:24:26 INFO > PIPE_READER4: TPT19003 ECI operator ID: PIPE_READER4-2596
15:24:26 INFO > PIPE_READER7: TPT19003 ECI operator ID: PIPE_READER7-6900
15:24:26 INFO > PIPE_READER2: TPT19003 ECI operator ID: PIPE_READER2-6128
15:24:26 INFO > PIPE_READER5: TPT19003 ECI operator ID: PIPE_READER5-3504
15:24:26 INFO > PIPE_READER8: TPT19003 ECI operator ID: PIPE_READER8-5456
15:24:26 INFO > PIPE_READER6: TPT19003 ECI operator ID: PIPE_READER6-5728
15:24:26 INFO > PIPE_READER3: TPT19003 ECI operator ID: PIPE_READER3-3396
15:24:28 INFO > DATA_LOAD: connecting sessions
15:24:37 INFO > PIPE_READER2: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_2'.
15:24:37 INFO > PIPE_READER1: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_1'.
15:24:37 INFO > PIPE_READER5: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_5'.
15:24:37 INFO > PIPE_READER3: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_3'.
15:24:37 INFO > PIPE_READER8: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_8'.
15:24:37 INFO > PIPE_READER6: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_6'.
15:24:37 INFO > PIPE_READER4: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_4'.
15:24:37 INFO > PIPE_READER7: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_7'.
15:24:39 INFO > DATA_LOAD: preparing target table
15:24:40 INFO > DATA_LOAD: entering Acquisition Phase
15:24:49 INFO > DATA_LOAD: entering Application Phase
15:24:50 INFO > DATA_LOAD: Statistics for Target Table: 'DCT_PROD_TDM.LOAD_AS_IS_TEST'
15:24:50 INFO > DATA_LOAD: Total Rows Sent To RDBMS: 3352
15:24:50 INFO > DATA_LOAD: Total Rows Applied: 3352
15:24:52 INFO > DATA_LOAD: disconnecting sessions
15:24:52 INFO > PIPE_READER4: TPT19221 Total files processed: 1.
15:24:52 INFO > PIPE_READER7: TPT19221 Total files processed: 1.
15:24:52 INFO > PIPE_READER8: TPT19221 Total files processed: 1.
15:24:52 INFO > PIPE_READER6: TPT19221 Total files processed: 1.
15:24:52 INFO > PIPE_READER3: TPT19221 Total files processed: 1.
15:24:52 INFO > PIPE_READER2: TPT19221 Total files processed: 1.
15:24:52 INFO > PIPE_READER5: TPT19221 Total files processed: 1.
15:24:52 INFO > PIPE_READER1: TPT19221 Total files processed: 1.
15:24:57 INFO > DATA_LOAD: Total processor time used = '0.405603 Second(s)'
15:24:57 INFO > DATA_LOAD: Start : Mon Aug 11 15:24:26 2014
15:24:57 INFO > DATA_LOAD: End : Mon Aug 11 15:24:57 2014
15:24:57 INFO > Job step LOAD_LOAD_AS_IS_TEST completed successfully
15:24:57 INFO > Job LoadAsIs completed successfully
Log (using 15.00):
14:24:54 INFO > Teradata Parallel Transporter Version 15.00.00.00
14:24:54 INFO > Job log: D:\Programacao\Java\Projetos\teradata-loader\src\test\resources\temp/LoadAsIs-42.out
14:24:54 ERROR> WARN:Failed to lookup account administrators
14:24:54 ERROR> WARN:Failed to lookup account administrators
14:24:54 INFO > Job id is LoadAsIs-42, running on DragonZord
14:24:57 INFO > Teradata Parallel Transporter SQL DDL Operator Version 15.00.00.00
14:24:57 INFO > PREPARES_LOAD: private log not specified
14:24:59 INFO > PREPARES_LOAD: connecting sessions
14:25:00 INFO > PREPARES_LOAD: The RDBMS retryable error code list was not found
14:25:00 INFO > PREPARES_LOAD: The job will use its internal retryable error codes
14:25:00 INFO > PREPARES_LOAD: sending SQL requests
14:25:07 INFO > PREPARES_LOAD: disconnecting sessions
14:25:08 INFO > PREPARES_LOAD: Total processor time used = '0.0936006 Second(s)'
14:25:08 INFO > PREPARES_LOAD: Start : Mon Aug 11 14:24:57 2014
14:25:08 INFO > PREPARES_LOAD: End : Mon Aug 11 14:25:08 2014
14:25:08 INFO > Job step PREPARES_LOAD_LOAD_AS_IS_TEST completed successfully
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER2[1]: TPT19006 Version 15.00.00.00
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER6[1]: TPT19006 Version 15.00.00.00
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER7[1]: TPT19006 Version 15.00.00.00
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER3[1]: TPT19006 Version 15.00.00.00
14:25:18 INFO > PIPE_READER2[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-5760-1'.
14:25:18 INFO > PIPE_READER6[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-4060-1'.
14:25:18 INFO > PIPE_READER7[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-5348-1'.
14:25:18 INFO > PIPE_READER3[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-3300-1'.
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER5[1]: TPT19006 Version 15.00.00.00
14:25:18 INFO > PIPE_READER2[1]: TPT19003 NotifyMethod: 'None (default)'
14:25:18 INFO > PIPE_READER2[1]: TPT19008 DataConnector Producer operator Instances: 1
14:25:18 INFO > PIPE_READER7[1]: TPT19003 NotifyMethod: 'None (default)'
14:25:18 INFO > PIPE_READER7[1]: TPT19008 DataConnector Producer operator Instances: 1
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER8[1]: TPT19006 Version 15.00.00.00
14:25:18 INFO > PIPE_READER5[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-1488-1'.
14:25:18 INFO > PIPE_READER6[1]: TPT19003 NotifyMethod: 'None (default)'
14:25:18 INFO > PIPE_READER3[1]: TPT19003 NotifyMethod: 'None (default)'
14:25:18 INFO > PIPE_READER6[1]: TPT19008 DataConnector Producer operator Instances: 1
14:25:18 INFO > PIPE_READER3[1]: TPT19008 DataConnector Producer operator Instances: 1
14:25:18 INFO > PIPE_READER5[1]: TPT19003 NotifyMethod: 'None (default)'
14:25:18 INFO > PIPE_READER5[1]: TPT19008 DataConnector Producer operator Instances: 1
14:25:18 INFO > PIPE_READER8[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-1196-1'.
14:25:18 INFO > PIPE_READER8[1]: TPT19003 NotifyMethod: 'None (default)'
14:25:18 INFO > PIPE_READER8[1]: TPT19008 DataConnector Producer operator Instances: 1
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER4[1]: TPT19006 Version 15.00.00.00
14:25:18 INFO > Teradata Parallel Transporter Load Operator Version 15.00.00.00
14:25:18 INFO > DATA_LOAD: private log not specified
14:25:18 INFO > PIPE_READER4[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-6988-1'.
14:25:18 INFO > PIPE_READER4[1]: TPT19003 NotifyMethod: 'None (default)'
14:25:18 INFO > PIPE_READER4[1]: TPT19008 DataConnector Producer operator Instances: 1
14:25:18 INFO > PIPE_READER6[1]: TPT19003 ECI operator ID: 'PIPE_READER6-4060'
14:25:18 INFO > PIPE_READER5[1]: TPT19003 ECI operator ID: 'PIPE_READER5-1488'
14:25:18 INFO > PIPE_READER8[1]: TPT19003 ECI operator ID: 'PIPE_READER8-1196'
14:25:18 INFO > PIPE_READER7[1]: TPT19003 ECI operator ID: 'PIPE_READER7-5348'
14:25:18 INFO > PIPE_READER2[1]: TPT19003 ECI operator ID: 'PIPE_READER2-5760'
14:25:18 INFO > PIPE_READER3[1]: TPT19003 ECI operator ID: 'PIPE_READER3-3300'
14:25:18 INFO > PIPE_READER6[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_6'.
14:25:18 INFO > PIPE_READER8[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_8'.
14:25:18 INFO > PIPE_READER7[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_7'.
14:25:18 INFO > PIPE_READER3[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_3'.
14:25:18 INFO > PIPE_READER2[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_2'.
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER1[1]: TPT19006 Version 15.00.00.00
14:25:18 INFO > PIPE_READER1[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-5768-1'.
14:25:18 INFO > PIPE_READER4[1]: TPT19003 ECI operator ID: 'PIPE_READER4-6988'
14:25:18 INFO > PIPE_READER1[1]: TPT19003 NotifyMethod: 'None (default)'
14:25:18 INFO > PIPE_READER1[1]: TPT19008 DataConnector Producer operator Instances: 1
14:25:18 INFO > PIPE_READER5[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_5'.
14:25:18 INFO > PIPE_READER4[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_4'.
14:25:18 INFO > PIPE_READER1[1]: TPT19003 ECI operator ID: 'PIPE_READER1-5768'
14:25:18 INFO > PIPE_READER1[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_1'.
14:25:21 INFO > DATA_LOAD: connecting sessions
14:25:21 INFO > DATA_LOAD: The RDBMS retryable error code list was not found
14:25:21 INFO > DATA_LOAD: The job will use its internal retryable error codes
14:25:52 INFO > DATA_LOAD: preparing target table
14:25:58 INFO > DATA_LOAD: entering Acquisition Phase
14:26:01 INFO > PIPE_READER1[1]: TPT19435 pmRead failed. EOF encountered before end of record (35)
14:26:01 INFO > PIPE_READER2[1]: TPT19435 pmRead failed. EOF encountered before end of record (35)
14:26:01 INFO > PIPE_READER3[1]: TPT19435 pmRead failed. EOF encountered before end of record (35)
14:26:01 INFO > PIPE_READER1[1]: TPT19305 Fatal error reading data.
14:26:01 INFO > PIPE_READER2[1]: TPT19305 Fatal error reading data.
14:26:01 INFO > PIPE_READER3[1]: TPT19305 Fatal error reading data.
14:26:01 INFO > PIPE_READER1[1]: TPT19015 TPT Exit code set to 12.
14:26:01 INFO > PIPE_READER2[1]: TPT19015 TPT Exit code set to 12.
14:26:01 INFO > PIPE_READER3[1]: TPT19015 TPT Exit code set to 12.
14:26:01 INFO > PIPE_READER4[1]: TPT19435 pmRead failed. EOF encountered before end of record (35)
14:26:01 INFO > PIPE_READER5[1]: TPT19435 pmRead failed. EOF encountered before end of record (35)
14:26:01 INFO > PIPE_READER4[1]: TPT19305 Fatal error reading data.
14:26:01 INFO > PIPE_READER5[1]: TPT19305 Fatal error reading data.
14:26:01 INFO > PIPE_READER4[1]: TPT19015 TPT Exit code set to 12.
14:26:01 INFO > PIPE_READER6[1]: TPT19435 pmRead failed. EOF encountered before end of record (35)
14:26:01 INFO > PIPE_READER5[1]: TPT19015 TPT Exit code set to 12.
14:26:01 INFO > PIPE_READER6[1]: TPT19305 Fatal error reading data.
14:26:01 INFO > PIPE_READER6[1]: TPT19015 TPT Exit code set to 12.
14:26:01 INFO > PIPE_READER7[1]: TPT19435 pmRead failed. EOF encountered before end of record (35)
14:26:01 INFO > PIPE_READER8[1]: TPT19435 pmRead failed. EOF encountered before end of record (35)
14:26:01 INFO > PIPE_READER7[1]: TPT19305 Fatal error reading data.
14:26:01 INFO > PIPE_READER8[1]: TPT19305 Fatal error reading data.
14:26:01 INFO > PIPE_READER7[1]: TPT19015 TPT Exit code set to 12.
14:26:01 INFO > PIPE_READER8[1]: TPT19015 TPT Exit code set to 12.
14:26:01 INFO > DATA_LOAD: disconnecting sessions
14:26:01 INFO > PIPE_READER2[1]: TPT19221 Total files processed: 0.
14:26:01 INFO > PIPE_READER3[1]: TPT19221 Total files processed: 0.
14:26:01 INFO > PIPE_READER5[1]: TPT19221 Total files processed: 0.
14:26:01 INFO > PIPE_READER6[1]: TPT19221 Total files processed: 0.
14:26:01 INFO > PIPE_READER4[1]: TPT19221 Total files processed: 0.
14:26:01 INFO > PIPE_READER1[1]: TPT19221 Total files processed: 0.
14:26:01 INFO > PIPE_READER8[1]: TPT19221 Total files processed: 0.
14:26:01 INFO > PIPE_READER7[1]: TPT19221 Total files processed: 0.
14:26:18 INFO > DATA_LOAD: Total processor time used = '0.312002 Second(s)'
14:26:18 INFO > DATA_LOAD: Start : Mon Aug 11 14:25:18 2014
14:26:18 INFO > DATA_LOAD: End : Mon Aug 11 14:26:18 2014
14:26:18 INFO > Job step LOAD_LOAD_AS_IS_TEST terminated (status 12)
14:26:18 INFO > Job LoadAsIs terminated (status 12)
14:26:18 INFO > Job start: Mon Aug 11 14:24:54 2014
14:26:18 INFO > Job end: Mon Aug 11 14:26:18 2014
Sorry for the long post.
Thanks for the help.
TPT export timeout issue
A TPT extract that creates a very large export (about 100 GB uncompressed) times out after approximately 1 hour, at about 40% completion. This has happened repeatedly with different numbers of instances and different buffer sizes. This is the largest extract in our system; we haven't had this issue with any other extracts.
The error is:
EXPORT_OPERATOR: sending SELECT request
EXPORT_OPERATOR: TPT10508: RDBMS error 2594: One of the FastExport session has been logged off
EXPORT_OPERATOR: TPT10508: RDBMS error 2594: One of the FastExport session has been logged off
EXPORT_OPERATOR: disconnecting sessions
Any idea how to fix this? The database timeout is set to 20 minutes. Even changing the database timeout to 60 minutes didn't solve the problem.
Any thoughts/suggestions?
Teradata to Oracle data transfer
Currently I have BTEQ scripts written to move data from schema1 to schema2. However, there is a plan to move schema2 from Teradata to an Oracle database. Can BTEQ scripts be used to move data from Teradata to Oracle? If not, what is the best alternative to achieve this?
Thanks in advance,
Cheers
TPT scripts
Currently I have TPT scripts that transfer data from schema1 (Teradata box1) to schema2 (Teradata box2). There is a plan to move the contents of Teradata box2 to Oracle.
Can the existing TPT scripts be reused to transfer data to Oracle? If so, do they require any major changes to work with Oracle? If not, what is the alternative?
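TPT's consumer operators only write to Teradata, so a script that loads box2 directly cannot simply be pointed at Oracle. A common pattern is to keep TPT for the export side and hand the data to an Oracle tool (SQL*Loader or external tables) via a delimited flat file. A minimal sketch of the export half — all names, credentials, and columns below are placeholders, not taken from your environment:

```sql
DEFINE JOB EXPORT_TO_FILE
DESCRIPTION 'Export a schema1 table to a delimited file for Oracle SQL*Loader'
(
    /* Delimited output requires an all-VARCHAR schema, hence the CASTs below */
    DEFINE SCHEMA emp_schema
    (
        eno   VARCHAR(10),
        ename VARCHAR(20)
    );

    DEFINE OPERATOR o_export
    TYPE EXPORT
    SCHEMA emp_schema
    ATTRIBUTES
    (
        VARCHAR TdpId        = 'mytdpid',      /* placeholder */
        VARCHAR UserName     = 'myuser',       /* placeholder */
        VARCHAR UserPassword = 'mypassword',   /* placeholder */
        VARCHAR SelectStmt   = 'SELECT CAST(eno AS VARCHAR(10)), ename FROM schema1.emp;'
    );

    DEFINE OPERATOR o_file
    TYPE DATACONNECTOR CONSUMER
    SCHEMA emp_schema
    ATTRIBUTES
    (
        VARCHAR FileName      = 'emp.txt',
        VARCHAR Format        = 'Delimited',
        VARCHAR TextDelimiter = '|',
        VARCHAR OpenMode      = 'Write'
    );

    APPLY TO OPERATOR (o_file)
    SELECT * FROM OPERATOR (o_export);
);
```

The resulting pipe-delimited file can then be described in a SQL*Loader control file (or an Oracle external table definition) on the Oracle side.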
Thanks,
TPT_INFRA TPT01057 Error Insufficient main storage for attempted allocation
Hi guys,
I have no idea what could be causing this error. The TD docs (http://www.info.teradata.com/htmlpubs/DB_TTU_14_10/index.html#page/General_Reference/B035_1096_112K/TPT.41.1267.html) explain the message but don't say how to solve the problem.
The situation is this: I wrote a VBS script that generates a TPT script and an .amj file to migrate an entire SQL Server database to Teradata. After several adjustments I reached a script that works very well, but it can generate pretty big TPT scripts; the one causing the problem has almost 14k lines. Although the script is big, it is simple: basically a DDL operator, then one schema per table, then one DataConnector producer (access module OLEDB_AXSMOD) per table (each referencing a job in the .amj file), and finally one Load operator per table. Together with the script, I also create an .amj file with one job per table (actually a SELECT statement, which I use to do some transformations).
In the steps, I first call the DDL operator with DDL instructions to drop and create the stage tables. Then come the steps that pull the data from SQL Server and push it into Teradata (one per table), and at the end a last call to the DDL operator renames the current tables to backup tables and the stage tables to the official tables.
As I said before, the generated script can get a little big, with hundreds of steps. With this script of 257 steps (apart from the initial and final steps), when it gets to about the 90th step I get the following message: "TPT_INFRA TPT01057 Error Insufficient main storage for attempted allocation".
That is the end: TPT hangs and I can't get anywhere from there. There isn't an exact point where it hangs; it can be at the 90th, 91st, 92nd step, and so on. If I Ctrl+C the job and reissue the tbuild command, TPT processes one more step and then issues the TPT01057 message again. The TPT infrastructure doesn't respond to commands like "twbcmd <job> resume" or "twbcmd <job> pause" (it just says it is processing the command), and "twbstat" says the job is running.
One important detail: this only happens between the end of one step and the start of the next; it never happens in the middle of a step.
This is one example of the output from tbuild:
Job step step_91 completed successfully
Teradata Parallel Transporter src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19006 Version 14.10.00.01
src_operator_ativa_ModeloPapelTrabalho_TipoOs Instance 1 directing private log report to 'producer_log-1'.
src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19003 NotifyMethod: 'None (default)'
src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19008 DataConnector Producer operator Instances: 1
Teradata Parallel Transporter Load Operator Version 14.10.00.01
load_operator_ativa_ModeloPapelTrabalho_TipoOs: private log specified: load_log_ativa_ModeloPapelTrabalho_TipoOs
src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19003 ECI operator ID: src_operator_ativa_ModeloPapelTrabalho_TipoOs-4624
load_operator_ativa_ModeloPapelTrabalho_TipoOs: connecting sessions
src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19222 Operator instance 1 processing file 'corp_ativa_acesso.amj'.
load_operator_ativa_ModeloPapelTrabalho_TipoOs: preparing target table
load_operator_ativa_ModeloPapelTrabalho_TipoOs: entering Acquisition Phase
load_operator_ativa_ModeloPapelTrabalho_TipoOs: entering Application Phase
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Statistics for Target Table: 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs'
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows Sent To RDBMS: 99
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows Applied: 99
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows in Error Table 1: 0
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows in Error Table 2: 0
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Duplicate Rows: 0
src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19221 Total files processed: 1.
load_operator_ativa_ModeloPapelTrabalho_TipoOs: disconnecting sessions
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total processor time used = '1.46641 Second(s)'
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Start : Mon Aug 18 18:53:19 2014
load_operator_ativa_ModeloPapelTrabalho_TipoOs: End : Mon Aug 18 18:53:30 2014
Job step step_92 completed successfully
TPT_INFRA: TPT01057: Error: Insufficient main storage for attempted allocation
As you can see, after TPT completed the 92nd step, it hung. From tlogview:
Task(SELECT_2[0001]): checkpoint completed, status = Success
Task(APPLY_1[0004]): checkpoint completed, status = Success
Task(APPLY_1[0003]): checkpoint completed, status = Success
Task(APPLY_1[0001]): checkpoint completed, status = Success
Task(APPLY_1[0002]): checkpoint completed, status = Success
load_operator_ativa_ModeloPapelTrabalho_TipoOs: entering Application Phase
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Statistics for Target Table: 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs'
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows Sent To RDBMS: 99
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows Applied: 99
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows in Error Table 1: 0
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows in Error Table 2: 0
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Duplicate Rows: 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 1, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 1, Total Rows Received = 0, Total Rows Sent = 0
src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19221 Total files processed: 1.
load_operator_ativa_ModeloPapelTrabalho_TipoOs: disconnecting sessions
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total processor time used = '1.46641 Second(s)'
load_operator_ativa_ModeloPapelTrabalho_TipoOs: Start : Mon Aug 18 18:53:19 2014
load_operator_ativa_ModeloPapelTrabalho_TipoOs: End : Mon Aug 18 18:53:30 2014
Job step step_92 completed successfully
TPT_INFRA: TPT01057: Error: Insufficient main storage for attempted allocation
TPT_INFRA: TPT02813: Error: Failed to create the Job variable
Teradata Parallel Transporter Executor Version 14.10.00.01
Teradata Parallel Transporter Coordinator Version 14.10.00.01
Teradata Parallel Transporter Executor Version 14.10.00.01
Teradata Parallel Transporter Executor Version 14.10.00.01
Teradata Parallel Transporter Executor Version 14.10.00.01
Teradata Parallel Transporter Executor Version 14.10.00.01
From the TPT script, here is a small extract (only the parts related to steps 92 and 93). Again, as I said before, it hangs after roughly the 90th step, not necessarily between the 91st and 92nd.
DEFINE JOB FILE_LOAD
DESCRIPTION 'Job corp_ativa_acesso'
(
    DEFINE OPERATOR DDL_Operator
    TYPE DDL
    ATTRIBUTES
    (
        VARCHAR PrivateLogName = 'ddl_log',
        VARCHAR UserName = 'DBADMIN',
        VARCHAR UserPassword = '*********',
        VARCHAR ARRAY ErrorList = ['3807','3803'],
        VARCHAR TdpId = 'maxcgu01-1-1'
    );

    DEFINE SCHEMA schema_ativa_ModeloPapelTrabalho_TipoOs
    (
        "IdModeloPapelTrabalho_TipoOs" INTEGER,
        "IdModeloPapelTrabalho" INTEGER,
        "IdTipoOs" INTEGER,
        "BolObrigatorio" SMALLINT,
        "IdPerfil" INTEGER
    );

    DEFINE SCHEMA schema_ativa_ModeloPapelTrabalhoVinculado
    (
        "IdModeloPapelTrabalhoVinculado" INTEGER,
        "IdModeloPapelTrabalhoA" INTEGER,
        "IdModeloPapelTrabalhoB" INTEGER
    );

    DEFINE OPERATOR src_operator_ativa_ModeloPapelTrabalho_TipoOs
    DESCRIPTION 'Odbc Operator para tabela [ativa].[ModeloPapelTrabalho_TipoOs]'
    TYPE DATACONNECTOR PRODUCER
    SCHEMA schema_ativa_ModeloPapelTrabalho_TipoOs
    ATTRIBUTES
    (
        VARCHAR AccessModuleName = 'OLEDB_AXSMOD',
        VARCHAR FileName = 'corp_ativa_acesso.amj',
        VARCHAR Format = 'Formatted',
        VARCHAR AccessModuleInitStr = 'noprompt jobid=92',
        VARCHAR OpenMode = 'Read',
        VARCHAR EnableScan = 'No',
        VARCHAR IndicatorMode = 'Yes',
        VARCHAR PrivateLogName = 'producer_log'
    );

    DEFINE OPERATOR src_operator_ativa_ModeloPapelTrabalhoVinculado
    DESCRIPTION 'Odbc Operator para tabela [ativa].[ModeloPapelTrabalhoVinculado]'
    TYPE DATACONNECTOR PRODUCER
    SCHEMA schema_ativa_ModeloPapelTrabalhoVinculado
    ATTRIBUTES
    (
        VARCHAR AccessModuleName = 'OLEDB_AXSMOD',
        VARCHAR FileName = 'corp_ativa_acesso.amj',
        VARCHAR Format = 'Formatted',
        VARCHAR AccessModuleInitStr = 'noprompt jobid=93',
        VARCHAR OpenMode = 'Read',
        VARCHAR EnableScan = 'No',
        VARCHAR IndicatorMode = 'Yes',
        VARCHAR PrivateLogName = 'producer_log'
    );

    DEFINE OPERATOR load_operator_ativa_ModeloPapelTrabalho_TipoOs
    TYPE LOAD
    SCHEMA schema_ativa_ModeloPapelTrabalho_TipoOs
    ATTRIBUTES
    (
        VARCHAR PrivateLogName = 'load_log_ativa_ModeloPapelTrabalho_TipoOs',
        VARCHAR UserName = 'DBADMIN',
        VARCHAR UserPassword = '*****',
        VARCHAR TdpId = 'maxcgu01-1-1',
        VARCHAR TargetTable = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs',
        VARCHAR LogTable = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_log',
        VARCHAR ErrorTable1 = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_1',
        VARCHAR ErrorTable2 = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_2'
    );

    DEFINE OPERATOR load_operator_ativa_ModeloPapelTrabalhoVinculado
    TYPE LOAD
    SCHEMA schema_ativa_ModeloPapelTrabalhoVinculado
    ATTRIBUTES
    (
        VARCHAR PrivateLogName = 'load_log_ativa_ModeloPapelTrabalhoVinculado',
        VARCHAR UserName = 'DBADMIN',
        VARCHAR UserPassword = '******',
        VARCHAR TdpId = 'maxcgu01-1-1',
        VARCHAR TargetTable = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado',
        VARCHAR LogTable = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_log',
        VARCHAR ErrorTable1 = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_1',
        VARCHAR ErrorTable2 = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_2'
    );

    STEP step_inicial_1
    (
        APPLY
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs;'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_log'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_1'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_2'),
            ('create table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs( IdModeloPapelTrabalho_TipoOs INTEGER,IdModeloPapelTrabalho INTEGER,IdTipoOs INTEGER,BolObrigatorio SMALLINT,IdPerfil INTEGER )UNIQUE PRIMARY INDEX(IdModeloPapelTrabalho_TipoOs)'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado;'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_log'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_1'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_2'),
            ('create table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado( IdModeloPapelTrabalhoVinculado INTEGER,IdModeloPapelTrabalhoA INTEGER,IdModeloPapelTrabalhoB INTEGER )UNIQUE PRIMARY INDEX(IdModeloPapelTrabalhoVinculado)')
        TO OPERATOR (DDL_Operator);
    );

    STEP step_92
    (
        APPLY
            ('Insert into Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs ("IdModeloPapelTrabalho_TipoOs","IdModeloPapelTrabalho","IdTipoOs","BolObrigatorio","IdPerfil") values (:"IdModeloPapelTrabalho_TipoOs",:"IdModeloPapelTrabalho",:"IdTipoOs",:"BolObrigatorio",:"IdPerfil")')
        TO OPERATOR (load_operator_ativa_ModeloPapelTrabalho_TipoOs[4])
        SELECT "IdModeloPapelTrabalho_TipoOs","IdModeloPapelTrabalho","IdTipoOs","BolObrigatorio","IdPerfil"
        FROM OPERATOR (src_operator_ativa_ModeloPapelTrabalho_TipoOs);
    );

    STEP step_93
    (
        APPLY
            ('Insert into Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado ("IdModeloPapelTrabalhoVinculado","IdModeloPapelTrabalhoA","IdModeloPapelTrabalhoB") values (:"IdModeloPapelTrabalhoVinculado",:"IdModeloPapelTrabalhoA",:"IdModeloPapelTrabalhoB")')
        TO OPERATOR (load_operator_ativa_ModeloPapelTrabalhoVinculado[4])
        SELECT "IdModeloPapelTrabalhoVinculado","IdModeloPapelTrabalhoA","IdModeloPapelTrabalhoB"
        FROM OPERATOR (src_operator_ativa_ModeloPapelTrabalhoVinculado);
    );

    STEP step_final_1
    (
        APPLY
            ('drop table Desenvolvimento.bdc_ativa_ModeloPapelTrabalho_TipoOs_bkp;'),
            ('rename table Desenvolvimento.bdc_ativa_ModeloPapelTrabalho_TipoOs to Desenvolvimento.bdc_ativa_ModeloPapelTrabalho_TipoOs_bkp;'),
            ('rename table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs to Desenvolvimento.bdc_ativa_ModeloPapelTrabalho_TipoOs;'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_log'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_1'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_2'),
            ('drop table Desenvolvimento.bdc_ativa_ModeloPapelTrabalhoVinculado_bkp;'),
            ('rename table Desenvolvimento.bdc_ativa_ModeloPapelTrabalhoVinculado to Desenvolvimento.bdc_ativa_ModeloPapelTrabalhoVinculado_bkp;'),
            ('rename table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado to Desenvolvimento.bdc_ativa_ModeloPapelTrabalhoVinculado;'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_log'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_1'),
            ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_2')
        TO OPERATOR (DDL_Operator);
    );
);
And lastly, a piece of the .amj file:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?OLE_DB_AXSMOD_FirstCompatibleVersion 14.00.00.00?>
<!--Configuration information for the OLE DB AXSMOD-->
<OLE_DB_AXSMOD_Jobs>
  <Job Id="92">
    <Source>
      <DataSourceParseName>{397C2819-8272-4532-AD3A-FB5E43BEAA39}<!--SQL Server Native Client 11.0 (SQLNCLI11)--></DataSourceParseName>
      <DataSourceProperties>
        <PropertySet>
          ...
        </PropertySet>
      </DataSourceProperties>
      <TableCommand>SELECT [IdModeloPapelTrabalho_TipoOs],[IdModeloPapelTrabalho],[IdTipoOs],convert(smallint,[BolObrigatorio]) as [BolObrigatorio],[IdPerfil] FROM [ativa].[ModeloPapelTrabalho_TipoOs]</TableCommand>
      <Columns>
        <Column><Selected/><SourceName>IdModeloPapelTrabalho_TipoOs</SourceName><DestinationName>IdModeloPapelTrabalho_TipoOs</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>IdModeloPapelTrabalho</SourceName><DestinationName>IdModeloPapelTrabalho</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>IdTipoOs</SourceName><DestinationName>IdTipoOs</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>BolObrigatorio</SourceName><DestinationName>BolObrigatorio</DestinationName><TypeName>SMALLINT</TypeName></Column>
        <Column><Selected/><SourceName>IdPerfil</SourceName><DestinationName>IdPerfil</DestinationName><TypeName>INTEGER</TypeName></Column>
      </Columns>
      <LocationOfLogTables>0<!--User's default database 0, Source database 1, Other database 2--></LocationOfLogTables>
      <OtherDatabase/>
      <CharDataTransferUTF8>1<!--OldMethod 0, NewMethod 1--></CharDataTransferUTF8>
    </Source>
    <CharacterEncoding>ASCII</CharacterEncoding>
    <CheckpointInterval/>
    <LargeDecimalSupport>Supported</LargeDecimalSupport>
    <RowsPerFetch>15000</RowsPerFetch>
    <BufferSize/>
  </Job>
  <Job Id="93">
    <Source>
      <DataSourceParseName>{397C2819-8272-4532-AD3A-FB5E43BEAA39}<!--SQL Server Native Client 11.0 (SQLNCLI11)--></DataSourceParseName>
      <DataSourceProperties>
        <PropertySet>
          ...
        </PropertySet>
      </DataSourceProperties>
      <TableCommand>SELECT [IdModeloPapelTrabalhoVinculado],[IdModeloPapelTrabalhoA],[IdModeloPapelTrabalhoB] FROM [ativa].[ModeloPapelTrabalhoVinculado]</TableCommand>
      <Columns>
        <Column><Selected/><SourceName>IdModeloPapelTrabalhoVinculado</SourceName><DestinationName>IdModeloPapelTrabalhoVinculado</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>IdModeloPapelTrabalhoA</SourceName><DestinationName>IdModeloPapelTrabalhoA</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>IdModeloPapelTrabalhoB</SourceName><DestinationName>IdModeloPapelTrabalhoB</DestinationName><TypeName>INTEGER</TypeName></Column>
      </Columns>
      <LocationOfLogTables>0<!--User's default database 0, Source database 1, Other database 2--></LocationOfLogTables>
      <OtherDatabase/>
      <CharDataTransferUTF8>1<!--OldMethod 0, NewMethod 1--></CharDataTransferUTF8>
    </Source>
    <CharacterEncoding>ASCII</CharacterEncoding>
    <CheckpointInterval/>
    <LargeDecimalSupport>Supported</LargeDecimalSupport>
    <RowsPerFetch>15000</RowsPerFetch>
    <BufferSize/>
  </Job>
</OLE_DB_AXSMOD_Jobs>
Any ideas?
SQL Assistant v.15.00 Keyboard Shortcut problems - Comment
I just installed the new TTU 15.00 SQL Assistant, but I'm having problems with the keyboard shortcuts. In previous versions, (CTRL + D) turned text into and out of comments: if no text was selected, the line with the cursor was toggled to/from a single-line comment with "--"; if a block of text was selected, it was toggled to/from a comment block with "/* */".
In SQL Assistant 15.00, (CTRL + D) is not defined at all.
In the Customize Keyboard menu (Category: Query, Command: Comment) I manually assigned this to (CTRL + D). Now I am able to toggle a comment block on/off, but when no text is selected the single-line comment version does not work.
How do I make (CTRL + D) handle both single-line comments and block comments, as in previous versions of SQL Assistant?
Peter Schwennesen
Bug in SQL Assistant 15.00
I think you have introduced a bug in SQL Assistant 15.00: changes I make to the setup are not retained after a restart.
The (CTRL+D) shortcut to toggle comments on/off is not predefined, so I am setting it up manually.
From the Tools menu I select Customize. In the Customize window I select "Keyboard…". In the Customize Keyboard menu I select the Query entry under Categories, and from the Commands list I select the Comment entry. I then specify (CTRL+D) in the Specify a Shortcut dropdown box and press the Assign button. Then I close all the menus.
Back in the query editor window, the comment (block comment only!) now works when I press (CTRL+D).
But after shutting down SQL Assistant and restarting it, the setting I just made is gone! To make (CTRL+D) work again I have to repeat the setup. I have tried this several times, and every time SQL Assistant is closed the setup I made is deleted. This cannot be the way the software is supposed to work.
I have looked in the folder where the SQL Assistant EXE file is located, but I have not been able to find any setup file where I could define the shortcut manually.
Please help! If there is a bug, please fix it and release an updated TTU 15. If I am doing something wrong, please guide me on how to make the setup change correctly.
Peter Schwennesen
PS: uninstalling TTU 14, which I was asked to do before I could install TTU 15, removed all the TTU 14 software and also the SQL Assistant 13 software that I had installed manually.
How to read/retain Western European characters (Windows-1252)
Our manufacturing client uses the Informatica ETL tool to extract data from SQL Server through an ODBC driver, with IANAAppCodePage set to "Western European (Windows-1252)", and loads the data into the Teradata DW. With this method the extraction and loading preserve the Windows-1252 characters, and the data matches perfectly when validated between the two systems.
I am trying to emulate this ETL functionality through Teradata PT, using the ODBC operator (with the Progress DataDirect SQL Server driver) to connect to and extract the data from SQL Server, and loading the data into the Teradata DW with the Load operator, leaving the character set at the default for the network-attached client system.
With this method, some of the original Windows-1252 characters produced by SQL Server get converted to a UTF-8 encoding, which invalidates the data when compared between the two systems. Hence, we would appreciate suggestions on the best method/option for this ETL functionality so that the original Windows-1252 characters are preserved through TPT.
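One option worth testing (a hedged sketch, not a verified fix): pin the TPT session character set explicitly in the job script instead of relying on the client default. As far as I know, LATIN1252_0A is the Teradata client character set corresponding to Windows code page 1252; confirm it is installed and enabled on your Teradata system before relying on it. The job name below is hypothetical and the operator definitions are omitted.

```
/* Force a Windows-1252-compatible session character set for the whole job,
   rather than the network client's default. Verify LATIN1252_0A is enabled
   on your system first. */
USING CHARACTER SET LATIN1252_0A
DEFINE JOB preserve_win1252
DESCRIPTION 'hypothetical job; ODBC producer and Load consumer go here'
(
   /* DEFINE SCHEMA / DEFINE OPERATOR / APPLY ... as in your existing job */
);
```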
Exporting to Multiple files in Fast Export
Hi All,
Can we export the output of multiple SELECT statements specified within a .BEGIN EXPORT / .END EXPORT block to multiple outfiles?
If we create multiple .BEGIN EXPORT / .END EXPORT blocks within a single FastExport job, are they executed in parallel or sequentially?
Thanks
ambuj
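For what it's worth, in my experience each .BEGIN EXPORT / .END EXPORT pair is a separate export task, and FastExport runs the tasks one after another in script order, each writing to the single OUTFILE named by its .EXPORT command. A minimal sketch (logon details, table names, and file names are placeholders):

```
.LOGTABLE mydb.fexp_log;           /* placeholder names throughout */
.LOGON tdpid/user,password;

/* Task 1: first result set to its own file */
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE out1.dat;
SELECT * FROM mydb.table1;
.END EXPORT;

/* Task 2: starts only after task 1 completes (sequential, not parallel) */
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE out2.dat;
SELECT * FROM mydb.table2;
.END EXPORT;

.LOGOFF;
```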
Fastload to non-empty table
Hi,
What is the exact reason we cannot FastLoad into non-empty tables? I understand we can't update existing rows, but what prevents simply inserting into a non-empty table?
Regards,
Am
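Since FastLoad writes whole data blocks directly rather than individual rows, the common workaround when the target already has data is to FastLoad into an empty staging table and then append with plain SQL. A hedged sketch with hypothetical database/table names:

```sql
-- Create an empty staging copy of the target table.
CREATE TABLE mydb.abc_stg AS mydb.abc WITH NO DATA;

-- ... run FastLoad against mydb.abc_stg (it is empty, so this is allowed) ...

-- Append the staged rows to the real, non-empty table, then clean up.
INSERT INTO mydb.abc SELECT * FROM mydb.abc_stg;
DROP TABLE mydb.abc_stg;
```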
Error Starting SQL Assistant v14.10.0.4
I cannot start SQL Assistant, and the error message is:
System.IO.DirectoryNotFoundException: Could not find special directory 'My Documents'.
I know my My Documents folder is broken (it was redirected to a network share that is no longer available), but I still need to be operational.
How can I "reset" SQL Assistant, start it with a different or clean config, or break its need to find My Documents?
This seems pretty fragile...
Teradata Bteq Optimization
Hi All,
We have a requirement to optimize our existing BTEQ scripts.
What are the possible areas of optimization in a BTEQ script?
Is there a best-practices document or coding-standard guideline in the official Teradata documentation?
Thanks,
Ambuj
TPT Extract - Newline character in data causes number of records mismatch
I have written a script that generates a TPT export script based on the parameters passed.
Since I am generating this script, it has to be generic: it must work for all kinds of tables and data types.
I have been successful in doing this. To remove the newline character, I am using the OREPLACE function in the TPT SELECT query for all VARCHAR columns, which makes the query very costly.
Is there a way to handle the newline character without this OREPLACE function?
Please let me know if there is any other workaround for this.
USING CHARACTER SET ASCII
DEFINE JOB ec_rgtry_rlatnp_dat_extract
DESCRIPTION 'export ec_rgtry_rlatnp_dat_Job'
(
DEFINE SCHEMA SCHEMA_ec_rgtry_rlatnp_dat_extract (
RGTRY_RLATNP_ID VARCHAR(40),
RGTRY_RLATNP_ASOCN_TYPE_CD VARCHAR(4),
SOR_ID VARCHAR(10),
ENT_CUST_ID VARCHAR(38)
);
DEFINE OPERATOR EXPORT_OPERATOR
TYPE EXPORT
SCHEMA SCHEMA_ec_rgtry_rlatnp_dat_extract
ATTRIBUTES (
UserName='XXXX',
UserPassword='YYYYY',
TdpId='DB',
MaxSessions=1,
MinSessions=1,
SpoolMode='NoSpool',
VARCHAR DateForm = 'ANSIDATE',
SelectStmt = 'SELECT
trim((OREPLACE(RGTRY_RLATNP_ID,x''0A'','''')) (VARCHAR(40))),
trim((OREPLACE(RGTRY_RLATNP_ASOCN_TYPE_CD,x''0A'','''')) (VARCHAR(4))),
TRIM(SOR_ID),
trim(((ENT_CUST_ID) (BIGINT) (VARCHAR(20))))
FROM DB.EC_RGTRY_RLATNP_PT
;',
VARCHAR ReportModeOn
);
DEFINE OPERATOR FILE_WRITER_ec_rgtry_rlatnp_dat_extract
TYPE DATACONNECTOR CONSUMER
SCHEMA SCHEMA_ec_rgtry_rlatnp_dat_extract
ATTRIBUTES (
FileName='20140505104506_abc.dat',
Format='DELIMITED',
TextDelimiter='|',
IndicatorMode='N',
OpenMode='Write'
);
APPLY TO OPERATOR (FILE_WRITER_ec_rgtry_rlatnp_dat_extract[1])
SELECT
RGTRY_RLATNP_ID ,
RGTRY_RLATNP_ASOCN_TYPE_CD ,
SOR_ID ,
ENT_CUST_ID
FROM OPERATOR (EXPORT_OPERATOR[1]);
);
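Two possibilities to avoid the per-column OREPLACE calls (hedged sketches, reusing the column/operator names from the script above). OTRANSLATE strips every occurrence of each character in its second argument in a single pass and may be cheaper than OREPLACE; alternatively, switching the file writer off DELIMITED format sidesteps the problem entirely, because FORMATTED records are length-prefixed rather than newline-terminated, so embedded x'0A' bytes no longer split rows.

```
/* Alternative 1: OTRANSLATE inside SelectStmt (mirroring the x'0A'
   notation used above; add x'0D' too if carriage returns can occur): */
   trim((OTRANSLATE(RGTRY_RLATNP_ID, x''0A'', '''')) (VARCHAR(40))),

/* Alternative 2: keep the SELECT untouched and write length-prefixed
   records instead of a delimited text file: */
DEFINE OPERATOR FILE_WRITER_formatted
TYPE DATACONNECTOR CONSUMER
SCHEMA SCHEMA_ec_rgtry_rlatnp_dat_extract
ATTRIBUTES (
   FileName='20140505104506_abc.dat',
   Format='FORMATTED',   /* embedded newlines no longer break records */
   IndicatorMode='N',
   OpenMode='Write'
);
```

Note that with FORMATTED output the downstream consumer must read length-prefixed records rather than splitting on the pipe delimiter.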
Looking for command line versions Teradata Parallel Transporter (TD ->streaming->SQL)
Dear colleagues,
I want to extract information from a Teradata environment and load it into SQL Server (I know this is not usually done :-P). For now my process writes to a file, and then the file is loaded into SQL Server. Important to note: I do not want to use SQL Server SSIS, but the TPT command line version!
So at this moment I use the Teradata Parallel Transporter command line for the extraction process. I believe it should be possible to stream the data with TPT from Teradata to SQL Server, skipping the intermediate file. This would save time and file system storage.
I'm looking for examples; who can help me with some command-line TPT scripts? For instance, I'm wondering whether it's possible to connect to SQL Server as a trusted user, how to format the connection string, etc.
We are using; Teradata 13.10 ( Teradata Parallel Transporter, command line version )
SQLserver 2008 R2 10.50.1600.1 (X64)
Below is an extract of the file-spooling command-line TPT script I use.
DEFINE JOB CDR_TPT_VW_EXPORT
DESCRIPTION 'TPT EXPORT VAN DE VIEW DATA DAG CDRS'
(
  DEFINE OPERATOR OPR1_TPT_EXPORT
  DESCRIPTION 'DEFINES A SPECIFIC TPT OPERATOR TO BE USED IN THE JOB OPR1_TPT_EXPORT'
  TYPE DATACONNECTOR CONSUMER
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR FILENAME,
    VARCHAR FORMAT,
    VARCHAR OPENMODE,
    VARCHAR INDICATORMODE,
    VARCHAR TEXTDELIMITER
  );

  DEFINE SCHEMA SCH_TPT_EXPORT
  DESCRIPTION 'DEFINES THE COLUMNS AND THEIR DATA TYPES.'
  (
    RECORD_GELADEN_IN_FACT VARCHAR(10),
    NETWORK_ACTIVITY_ID VARCHAR(18),
    /* ... removed a lot of columns ;-) ... */
    CDR_BRON VARCHAR(4)
  );

  DEFINE OPERATOR OPR2_TPT_EXPORT
  DESCRIPTION 'DEFINES A SPECIFIC TPT OPERATOR TO BE USED IN THE JOB OPR2_TPT_EXPORT.'
  TYPE EXPORT
  SCHEMA SCH_TPT_EXPORT
  ATTRIBUTES
  (
    VARCHAR USERNAME = @UsrID,
    VARCHAR USERPASSWORD = @Pwd,
    VARCHAR TDPID,
    VARCHAR SELECTSTMT,
    INTEGER BLOCKSIZE = 64330,
    INTEGER MAXSESSIONS = 126,
    INTEGER MINSESSIONS = 1,
    VARCHAR PRIVATELOGNAME = 'BXXXX_TPT_EXP_' || $JOBID || '.LOG',
    VARCHAR TRACELEVEL = 'All'
  );

  APPLY 'VARCHAR QUERYBANDSESSINFO = UTILITYDATASIZE=LARGE;'
  TO OPERATOR
  (
    OPR1_TPT_EXPORT[8]
    ATTRIBUTES
    (
      FILENAME = '\\KLAN.LOCAL\SOURCES\TPT_CDRVW_DD.TXT',
      FORMAT = 'DELIMITED',
      OPENMODE = 'WRITE',
      INDICATORMODE = 'N',
      TEXTDELIMITER = '|'
    )
  )
  SELECT * FROM OPERATOR
  (
    OPR2_TPT_EXPORT[8]
    ATTRIBUTES
    (
      SELECTSTMT = 'SELECT RECORD_GELADEN_IN_FACT, /* ... removed also a lot of columns ;-) ... */ CDR_BRON FROM SANDBOX.VW_EXP_CDR_DD;',
      TDPID = 'database01'
    )
  );
);
Any help is welcome!
Easy Loader Issue
I'm trying to load a csv file into a table using TPT Easy Loader.
On the first run of the script below, the error I receive is "Object Addresses2_RL does not exist."
tdload -f "C:\addresses_table.csv" -t "Addresses2" -h tdat15 -u user1 -p *pw* --TargetWorkingDatabase "SQL_CLASS" -d ,
Addresses2_RL exists in the user1 database while it looks like _ET and _UV tables are created in the target working database SQL_CLASS.
How do I fix this issue?
Thank you for your help.
TPT: How to forcefully fail the job
Mload error: RDBMS failure: 2644, No more room in database abc
Hi,
I am getting a "No more room in database" error whenever I try to run an MLOAD script through UNIX. I tried running it against a different database but still get the same error. Below are the script and log.
SCRIPT:
.LOGTABLE logtable001;
.LOGON tdp3/user2,tyler;
.BEGIN MLOAD TABLES abc;
.LAYOUT abc;
(
.FIELD in_eno integer
.FIELD in_ename varchar(20)
.FILLER in_dno interger
.FIELD in_sal decimal(10,2)
);
.DML LABEL PAYROLL;
insert into abc(eno,ename,dno,sal)
values(:in_eno, :in_ename, :in_dno,:in_sal);
.IMPORT xyz
LAYOUT abc
FORMAT VARtext ''
APPLY LABEL PAYROLL;
.END MLOAD;
.LOGOFF;
LOG:
0001 .LOGTABLE logtable001;
0002 .LOGON tdp3/user2,tyler;
**** 09:02:37 UTY8400 Teradata Database Release: 13.10.07.28b
**** 09:02:37 UTY8400 Teradata Database Version: 13.10.07.21
**** 09:02:37 UTY8400 Default character set: ASCII
**** 09:02:37 UTY8400 Current RDBMS has interval support
**** 09:02:37 UTY8400 Current RDBMS has UDT support
**** 09:02:37 UTY8400 Current RDBMS has TASM support
**** 09:02:37 UTY8400 Maximum supported buffer size: 1M
**** 09:02:37 UTY8400 Data Encryption supported by RDBMS server
**** 09:02:39 UTY1008 RDBMS failure: 2644, No more room in database abc.
**** 09:02:39 UTY2410 Total processor time used = '0.01 Seconds'
. Start : 09:02:37 - FRI AUG 29, 2014
. End : 09:02:39 - FRI AUG 29, 2014
. Highest return code encountered = '12'.
MF TPT failing all of a sudden
On August 27, between approximately 8:30 and 11:30, something caused our mainframe (MF) TPT jobs to go awry. The MF team says nothing changed; the DB team says nothing changed. The error we are now getting is:
TPT_INFRA: TPT04013: Error: opening temporary job script output file: "EDC5000I No error occurred." (0).
Job script compilation failed.
The TPT script is quite simple (for the test):
DEFINE JOB load_L_LOCATION_STG(
DEFINE OPERATOR DDL_OPERATOR
TYPE DDL
ATTRIBUTES
(
VARCHAR PrivateLogName = 'L_LOCATION_STG_ddllog',
VARCHAR TdpId = @MyTdpId,
VARCHAR UserName = @MyUserName,
VARCHAR UserPassword = @MyPassword,
VARCHAR logonmech = ldap,
VARCHAR ARRAY ErrorList =['3807','2580']
);
STEP drop_error_tables_and_delete_data
(
APPLY
('DROP TABLE LDWM_STG_TEMP_DB.L_LOCATION_STG_W;')
TO OPERATOR (DDL_OPERATOR () );
);
);
We are getting this error on all TPT jobs regardless of which ID runs them. This was not the case prior to Aug 27th.
Our MF security is handled by the Top Secret application.
This same TPT script also runs perfectly from a Linux platform.
Any suggestions out there? We are at a complete loss.
One thing our MF folks did was change the UID for the prod users, and jobs began running successfully, but some sporadically failed with a different message. The odd thing is that the exact same job may run fine 3 times in a row, then fail, then fail twice, run fine once, fail again, etc.