Channel: Teradata Downloads - Tools

Load data from Oracle to Teradata with correct character set


Hello, I am new to this forum and I would appreciate your help.

 

I had a problem loading some data from an Oracle database to Teradata using the DataDirect tool. All data was loaded to the target, but some special Brazilian characters (like "Ç") arrived in Teradata as a "?" character.

 

I tried putting "USING CHARACTER SET UTF8" at the beginning of the script file, and I tried putting "-e UTF8" on the command line. Nothing works as I expect.

 

I looked for a setting on OPERATOR TYPE ODBC to specify which character set the driver will use, but I didn't find such a parameter.

 

Can anyone help me?

 

Thanks.
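One avenue I am exploring: since the transcoding may happen inside the ODBC driver itself, the DSN configuration seems worth checking. A hedged sketch of a UNIX odbc.ini entry for a DataDirect Oracle driver; the driver path and connection values are illustrative, and IANAAppCodePage is a DataDirect-specific option (106 being its code for UTF-8):

```ini
[ORACLE_SRC]
; Illustrative DataDirect Oracle driver entry - adjust paths/values
Driver=/opt/Progress/DataDirect/lib/ddora27.so
HostName=orahost
PortNumber=1521
SID=ORCL
; Ask the driver to hand data to the application as UTF-8
IANAAppCodePage=106
```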

 

/* SCRIPT FILE */

USING CHARACTER SET UTF8

DEFINE JOB "ODILoadaa1a1e33-521c-4b7b-95c8-c829ecee6368" 

DESCRIPTION 'ODI step name: prCARGA_TPT2 ODI step #:12 from session #aa1a1e33-521c-4b7b-95c8-c829ecee6368'

(

DEFINE SCHEMA Oracle_DataSource_Schema DESCRIPTION 'Source table for PESSOA_JURIDICA'

  (

  OID_PESSOA NUMBER(22,0),OID_PES_MATRIZ NUMBER(22,0),NOM_RAZAO_SOCIAL VARCHAR(115),NOM_FANTASIA VARCHAR(55),DAT_CONSTITUICAO VARCHAR(75),OID_NAT_JURIDICA NUMBER(22,0),QTD_FILIAL NUMBER(5,0),QTD_CLIENTE NUMBER(6,0),NUM_REGISTRO VARCHAR(16),DAT_REGISTRO VARCHAR(75),DAT_ALT_CONTRATUAL VARCHAR(75),FLG_MATRIZ VARCHAR(1),OID_CRZ_CAL_SOCIAL NUMBER(22,0),OID_NAT_EMPRESA NUMBER(22,0),OID_CAT_BACEN NUMBER(22,0),TPO_REGIME_TRIBUTARIO NUMBER(1,0),FLG_SUBESTAB_PODER VARCHAR(1),OID_TPO_COOPERATIVA NUMBER(22,0),VLR_FATURAMENTO NUMBER(14,2),FLG_AUDITORIA VARCHAR(1),FLG_AREA_ATUACAO VARCHAR(1),NUM_ARQUIVO VARCHAR(16),DAT_ARQUIVO VARCHAR(75),COD_USU_ATUALIZACAO NUMBER(5,0),DAT_ATUALIZACAO VARCHAR(75),NUM_CNPJ VARCHAR(14),QTD_FUNCIONARIO NUMBER(7,0),ANO_FATURAMENTO NUMBER(4,0)

  );

  DEFINE OPERATOR ODBC_Operator

  DESCRIPTION 'TPT ODBC Operator'

  TYPE ODBC

  SCHEMA Oracle_DataSource_Schema

  ATTRIBUTES

  (

  VARCHAR  PrivateLogName   = @ODBCPrivateLogName,

  VARCHAR  DSNName             = @DSNName,

  VARCHAR  UserName             = @ODBCUserName,

  VARCHAR  UserPassword       = @ODBCPassword,

  VARCHAR  SelectStmt             = @SelectStmt

  );

  APPLY

 

 

(

'INSERT INTO A_STG_SICREDI_OWNER_T.PESSOA_JURIDICA

(

  OID_PESSOA,OID_PES_MATRIZ,NOM_RAZAO_SOCIAL,NOM_FANTASIA,DAT_CONSTITUICAO,OID_NAT_JURIDICA,QTD_FILIAL,QTD_CLIENTE,NUM_REGISTRO,DAT_REGISTRO,DAT_ALT_CONTRATUAL,FLG_MATRIZ,OID_CRZ_CAL_SOCIAL,OID_NAT_EMPRESA,OID_CAT_BACEN,TPO_REGIME_TRIBUTARIO,FLG_SUBESTAB_PODER,OID_TPO_COOPERATIVA,VLR_FATURAMENTO,FLG_AUDITORIA,FLG_AREA_ATUACAO,NUM_ARQUIVO,DAT_ARQUIVO,COD_USU_ATUALIZACAO,DAT_ATUALIZACAO,NUM_CNPJ,QTD_FUNCIONARIO,ANO_FATURAMENTO,TPO_DML,DAT_ATUALIZACAO_ORIGEM,NUM_SCN_TRANSACAO,NUM_IDF_REGISTRO

)

VALUES 

(

  :OID_PESSOA,:OID_PES_MATRIZ,:NOM_RAZAO_SOCIAL,:NOM_FANTASIA,:DAT_CONSTITUICAO (TIMESTAMP(0), FORMAT ''YYYYMMDDHHMISS''),:OID_NAT_JURIDICA,:QTD_FILIAL,:QTD_CLIENTE,:NUM_REGISTRO,:DAT_REGISTRO (TIMESTAMP(0), FORMAT ''YYYYMMDDHHMISS''),:DAT_ALT_CONTRATUAL (TIMESTAMP(0), FORMAT ''YYYYMMDDHHMISS''),:FLG_MATRIZ,:OID_CRZ_CAL_SOCIAL,:OID_NAT_EMPRESA,:OID_CAT_BACEN,:TPO_REGIME_TRIBUTARIO,:FLG_SUBESTAB_PODER,:OID_TPO_COOPERATIVA,:VLR_FATURAMENTO,:FLG_AUDITORIA,:FLG_AREA_ATUACAO,:NUM_ARQUIVO,:DAT_ARQUIVO (TIMESTAMP(0), FORMAT ''YYYYMMDDHHMISS''),:COD_USU_ATUALIZACAO,:DAT_ATUALIZACAO (TIMESTAMP(0), FORMAT ''YYYYMMDDHHMISS''),:NUM_CNPJ,:QTD_FUNCIONARIO,:ANO_FATURAMENTO,''I'',timestamp ''2014-01-01 00:00:00'',0,0

);'

)

TO OPERATOR 

(

$LOAD [@LoadInstances]

)

SELECT 

  OID_PESSOA,OID_PES_MATRIZ,NOM_RAZAO_SOCIAL,NOM_FANTASIA,DAT_CONSTITUICAO,OID_NAT_JURIDICA,QTD_FILIAL,QTD_CLIENTE,NUM_REGISTRO,DAT_REGISTRO,DAT_ALT_CONTRATUAL,FLG_MATRIZ,OID_CRZ_CAL_SOCIAL,OID_NAT_EMPRESA,OID_CAT_BACEN,TPO_REGIME_TRIBUTARIO,FLG_SUBESTAB_PODER,OID_TPO_COOPERATIVA,VLR_FATURAMENTO,FLG_AUDITORIA,FLG_AREA_ATUACAO,NUM_ARQUIVO,DAT_ARQUIVO,COD_USU_ATUALIZACAO,DAT_ATUALIZACAO,NUM_CNPJ,QTD_FUNCIONARIO,ANO_FATURAMENTO

FROM OPERATOR(ODBC_Operator [@ODBCInstances]); 

);

 

/* CONFIGURATION FILE */

/**********************************/

/* Values for ODBC operator */

/**********************************/

ODBCInstances            = 1,

ODBCPrivateLogName       = 'odbclog',

DSNName                  = 'DATABASE',

ODBCUserName             = 'USERNAME',

ODBCPassword             = 'PASS',

SelectStmt               = 'SELECT OID_PESSOA,OID_PES_MATRIZ, NOM_RAZAO_SOCIAL,NOM_FANTASIA,TO_CHAR (DAT_CONSTITUICAO, ''YYYYMMDDHH24MISS'') AS DAT_CONSTITUICAO,OID_NAT_JURIDICA,QTD_FILIAL,QTD_CLIENTE,NUM_REGISTRO,TO_CHAR (DAT_REGISTRO, ''YYYYMMDDHH24MISS'') AS DAT_REGISTRO,TO_CHAR (DAT_ALT_CONTRATUAL, ''YYYYMMDDHH24MISS'') AS DAT_ALT_CONTRATUAL,FLG_MATRIZ,OID_CRZ_CAL_SOCIAL,OID_NAT_EMPRESA,OID_CAT_BACEN,TPO_REGIME_TRIBUTARIO,FLG_SUBESTAB_PODER,OID_TPO_COOPERATIVA,VLR_FATURAMENTO,FLG_AUDITORIA,FLG_AREA_ATUACAO,NUM_ARQUIVO,TO_CHAR (DAT_ARQUIVO, ''YYYYMMDDHH24MISS'') AS DAT_ARQUIVO,COD_USU_ATUALIZACAO,TO_CHAR (DAT_ATUALIZACAO, ''YYYYMMDDHH24MISS'') AS DAT_ATUALIZACAO,NUM_CNPJ,QTD_FUNCIONARIO,ANO_FATURAMENTO FROM SICREDI_OWNER.PESSOA_JURIDICA;'

/**********************************/

/* Values for LOAD operator */

/*********************************/

LoadInstances            = 1,

LoadPrivateLogName       = 'loadlog',

TargetTable              = 'A_STG_SICREDI_OWNER_T.PESSOA_JURIDICA',  

TargetTdpId              = 'TDHom_a_stgint_odi_run',

TargetUserName           = 'USER',

TargetUserPassword       = 'PASS',

LogTable                 = 'A_STG_SICREDI_OWNER_W.PESSOA_JURIDICA_LOG',

ErrorTable1              = 'A_STG_SICREDI_OWNER_W.PESSOA_JURIDICA_ET',

ErrorTable2              = 'A_STG_SICREDI_OWNER_W.PESSOA_JURIDICA_UV'

 

 /* DEFINITION OF SOURCE TABLE AT ORACLE */

CREATE TABLE "SICREDI_OWNER"."PESSOA_JURIDICA"

  (

    "OID_PESSOA"      NUMBER NOT NULL ENABLE,

    "OID_PES_MATRIZ"  NUMBER,

    "NOM_RAZAO_SOCIAL" VARCHAR2(115 BYTE) NOT NULL ENABLE,

    "NOM_FANTASIA"    VARCHAR2(55 BYTE),

    "DAT_CONSTITUICAO" DATE,

    "OID_NAT_JURIDICA" NUMBER,

    "QTD_FILIAL"      NUMBER(5,0),

    "QTD_CLIENTE"     NUMBER(6,0),

    "NUM_REGISTRO"    VARCHAR2(16 BYTE),

    "DAT_REGISTRO" DATE,

    "DAT_ALT_CONTRATUAL" DATE,

    "FLG_MATRIZ"           VARCHAR2(1 BYTE),

    "OID_CRZ_CAL_SOCIAL"   NUMBER,

    "OID_NAT_EMPRESA"      NUMBER,

    "OID_CAT_BACEN"        NUMBER,

    "TPO_REGIME_TRIBUTARIO" NUMBER(1,0),

    "FLG_SUBESTAB_PODER"   VARCHAR2(1 BYTE),

    "OID_TPO_COOPERATIVA"  NUMBER,

    "VLR_FATURAMENTO"      NUMBER(14,2),

    "FLG_AUDITORIA"        VARCHAR2(1 BYTE),

    "FLG_AREA_ATUACAO"     VARCHAR2(1 BYTE),

    "NUM_ARQUIVO"          VARCHAR2(16 BYTE),

    "DAT_ARQUIVO" DATE,

    "COD_USU_ATUALIZACAO" NUMBER(5,0) NOT NULL ENABLE,

    "DAT_ATUALIZACAO" DATE NOT NULL ENABLE,

    "NUM_CNPJ"       VARCHAR2(14 BYTE),

    "QTD_FUNCIONARIO" NUMBER(7,0),

    "ANO_FATURAMENTO" NUMBER(4,0)

  )

 

/* DEFINITION OF TERADATA TARGET TABLE */

CREATE MULTISET TABLE A_STG_SICREDI_OWNER_T.PESSOA_JURIDICA ,NO FALLBACK ,

     NO BEFORE JOURNAL,

     NO AFTER JOURNAL,

     CHECKSUM = DEFAULT,

     DEFAULT MERGEBLOCKRATIO

     (

      OID_PESSOA NUMBER(22,0) NOT NULL,

      OID_PES_MATRIZ NUMBER(22,0),

      NOM_RAZAO_SOCIAL VARCHAR(115) CHARACTER SET UNICODE NOT CASESPECIFIC NOT NULL,

      NOM_FANTASIA VARCHAR(55) CHARACTER SET UNICODE NOT CASESPECIFIC,

      DAT_CONSTITUICAO TIMESTAMP(0),

      OID_NAT_JURIDICA NUMBER(22,0),

      QTD_FILIAL NUMBER(5,0),

      QTD_CLIENTE NUMBER(6,0),

      NUM_REGISTRO VARCHAR(16) CHARACTER SET UNICODE NOT CASESPECIFIC,

      DAT_REGISTRO TIMESTAMP(0),

      DAT_ALT_CONTRATUAL TIMESTAMP(0),

      FLG_MATRIZ VARCHAR(1) CHARACTER SET UNICODE NOT CASESPECIFIC,

      OID_CRZ_CAL_SOCIAL NUMBER(22,0),

      OID_NAT_EMPRESA NUMBER(22,0),

      OID_CAT_BACEN NUMBER(22,0),

      TPO_REGIME_TRIBUTARIO NUMBER(1,0),

      FLG_SUBESTAB_PODER VARCHAR(1) CHARACTER SET UNICODE NOT CASESPECIFIC,

      OID_TPO_COOPERATIVA NUMBER(22,0),

      VLR_FATURAMENTO NUMBER(14,2),

      FLG_AUDITORIA VARCHAR(1) CHARACTER SET UNICODE NOT CASESPECIFIC,

      FLG_AREA_ATUACAO VARCHAR(1) CHARACTER SET UNICODE NOT CASESPECIFIC,

      NUM_ARQUIVO VARCHAR(16) CHARACTER SET UNICODE NOT CASESPECIFIC,

      DAT_ARQUIVO TIMESTAMP(0),

      COD_USU_ATUALIZACAO NUMBER(5,0) NOT NULL,

      DAT_ATUALIZACAO TIMESTAMP(0) NOT NULL,

      NUM_CNPJ VARCHAR(14) CHARACTER SET UNICODE NOT CASESPECIFIC,

      QTD_FUNCIONARIO NUMBER(7,0),

      ANO_FATURAMENTO NUMBER(4,0),

      TPO_DML CHAR(1) CHARACTER SET UNICODE NOT CASESPECIFIC NOT NULL,

      DAT_ATUALIZACAO_ORIGEM TIMESTAMP(6) NOT NULL,

      NUM_SCN_TRANSACAO NUMBER(20,0),

      NUM_IDF_REGISTRO NUMBER(20,0) NOT NULL)

NO PRIMARY INDEX ;

 

 

Where is the Teradata database in the Teradata Tools and Utilities (TTU) 15.00 for Windows?

Good afternoon,
 
Does anybody know where the Teradata database is in the Teradata Tools and Utilities (TTU) 15.00 package installation for Windows? Is there any way to load a Teradata demo database in order to start running queries?

ODBCOperator: TPT17187: Failed to obtain shared memory segment for for column data buffer, due to error 48.

This error occurs when using the Progress DataDirect 64-bit SQL Server ODBC drivers and moving data via TPT. We have contacted Progress DataDirect; they investigated this issue and their response was that this is a Teradata issue. I've asked them to contact you, as they may be partners (not sure), so you may also be receiving an official request from them.
Here are symptoms of the issue which are very specific.
1. This error only occurs with the 64-bit SQL Server Progress DataDirect WireProtocol drivers. Moving via the 32-bit Progress drivers does not cause the issue. The target data source was using the generic Teradata ODBC drivers.
2. It only occurs when moving data which is all of data type SMALLINT on SQL Server. You can be moving 5 SMALLINT columns or 1 SMALLINT column; I believe the number of columns doesn't matter, only that the data is all SMALLINT. If you add one other column which is non-SMALLINT, you will not receive this error.
We're using TTU 15 from the website and the ODBC Operator to extract the data from SQL Server and the Stream Operator to load the data to Teradata.
I have scripts to duplicate the error if someone from Teradata would like to contact me via email.  Thank you for your help.
ODBCOperator: TPT17187: Failed to obtain shared memory segment for for column data buffer, due to error 48.
ODBCOperator: TPT17174: Error 0 allocating memory for row size buffer
 

Error Number : 120801 Error Message : Teradata WB tlogview failed when reading

Hi guys,
Hope you are doing great. We are facing a problem in our ETL loads.
During bulk loads of transactional data we sometimes face the following error:
 
Error Number : 120801 Error Message : Teradata WB tlogview failed when reading <C:/Program Files/Teradata/Client/15.00/Teradata Parallel Transporter/logs/DP_EDW_STAGING_KzInvoice_Basis_1_Tax_44-120633.out> log file.
 
This error doesn't have a consistent cause; sometimes it occurs, and sometimes the same job runs successfully on re-execution.
Can you please advise what causes this error and how we can handle it?
 
thanks and Regards,
Noman
 

uninstall_TTU.vbs question: is it a bug or a feature?

Hi all,
I do not know if it's a "bug" or a "feature" so I have to ask to more experienced people.

The scenario is one where you have to automate the installation of the TTU on a large number of clients.
The client (PC) may or may not already have a version of TTU installed.
There may or may not be a process on the PC using the client.
So, in order to make a robust script, I was doing some tests/experiments and I found one issue using the script uninstall_TTU.vbs.
To show the "problem":
A) On a machine I install:
-Shared ICU Libraries for Teradata
-Teradata GSS Client nt-i386
-Teradata CLIv2
-ODBC Driver for Teradata
-OLE DB Provider for Teradata
B) In a new DOS shell window I run a script that opens an ODBC connection and executes a query every second.
C) In a new DOS shell window I run a script that opens an OLE DB connection and executes a query every second.
At this point there are two processes using the client (DLLs).
D) In a new DOS shell window I run "cscript uninstall_TTU.vbs ALL /PRIORTOVERSION:99.00.00.00" and wait for completion.
Since there are at least two processes using the ODBC and OLE DB DLLs, my expectation was that uninstall_TTU.vbs would *fail*, or return an exit code indicating that a system restart is needed to complete the uninstallation.
Why does uninstall_TTU.vbs not return a specific exit code to force a system restart? Is this expected by design?
After step D above you can install the same tools/products of step A, *but* at a different version level.
So the system can have processes using the same utility/tool at a different version level; is that safe?
Ciao,
Giovanni

TPT - problem with processing file

Hi everyone,
I have a job consisting of several steps. I want to load data from flat files to the database, but I have a problem with one step. Below I paste information from the log file of this step where the problem occurred:
-----------------------------------
Teradata Parallel Transporter Version 14.00.00.06
Job log: /DWH/Develop/proc_ods/wyrocznia/wyrocznia_l_ods/tmp/wyrocznia_l_ods_WYR_ORDER_DOCUMENT_L_20150309_165331-1016.out
Job id is wyrocznia_l_ods_WYR_ORDER_DOCUMENT_L_20150309_165331-1016, running on dwd02
Teradata Parallel Transporter SQL DDL Operator Version 14.00.00.06
$DDL: private log not specified
$DDL: connecting sessions
$DDL: sending SQL requests
$DDL: TPT10508: RDBMS error 3807: Object 'IVM_DB_TMP.ET_WYR_ORDER_DOCUMENT_L' does not exist.
$DDL: TPT18046: Warning: error is ignored as requested in ErrorList
$DDL: TPT10508: RDBMS error 3807: Object 'IVM_DB_TMP.UV_WYR_ORDER_DOCUMENT_L' does not exist.
$DDL: TPT18046: Warning: error is ignored as requested in ErrorList
$DDL: Rows Deleted:  0
$DDL: disconnecting sessions
$DDL: Total processor time used = '0.058629 Second(s)'
$DDL: Start : Mon Mar  9 16:53:33 2015
$DDL: End   : Mon Mar  9 16:53:33 2015
Job step setup_step completed successfully
Teradata Parallel Transporter $FILE_READER: TPT19006 Version 14.00.00.06
$FILE_READER: TPT19008 DataConnector Producer operator Instances: 1
Teradata Parallel Transporter SQL Inserter Operator Version 14.00.00.06
$INSERTER: private log not specified
$FILE_READER: TPT19003 ECI operator ID: $FILE_READER-5124192
$FILE_READER: TPT19222 Operator instance 1 processing file '/DWH/Develop/SourceData_incr/wyrocznia_ods/load/order_document.dat'.
$INSERTER: connecting sessions
$FILE_READER: TPT19350 I/O error on file '/DWH/Develop/SourceData_incr/wyrocznia_ods/load/order_document.dat'.
$FILE_READER: TPT19003 SYSTEM_BUFFERMODE: 'no'
$FILE_READER: TPT19221 Total files processed: 0.
$INSERTER: Total Rows Sent To RDBMS:      0
$INSERTER: Total Rows Applied:            0
$INSERTER: disconnecting sessions
$INSERTER: Total processor time used = '0.059809 Second(s)'
$INSERTER: Start : Mon Mar  9 16:53:33 2015
$INSERTER: End   : Mon Mar  9 16:53:34 2015
Job step main_step terminated (status 12)
Job wyrocznia_l_ods_WYR_ORDER_DOCUMENT_L_20150309_165331 terminated (status 12)
 
-------------------------------------------------------------------------------------------------------------------
I checked the order_document.dat file: it doesn't contain any records (the file is empty, which is normal in this case). I noticed that when I add one sample record to the file, the problem disappears.
I would be very grateful for help / suggestions on how to eliminate the problem. I would like TPT not to abort the job when a flat file is empty.
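A wrapper along these lines is the workaround I am considering, assuming the emptiness check belongs outside TPT; the tbuild script name below is hypothetical:

```shell
#!/bin/sh
# Returns success (0) only when the file exists and has at least one byte.
should_load() {
    [ -s "$1" ]
}

DATAFILE="${1:-/DWH/Develop/SourceData_incr/wyrocznia_ods/load/order_document.dat}"

if should_load "$DATAFILE"; then
    # Hypothetical job script name - substitute your own.
    tbuild -f load_order_document.tpt
else
    echo "File $DATAFILE is empty - skipping load step."
fi
```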
 

MLOAD - UTILITY

I am new to Teradata. We just created a database named SPIDER_TBLS and created one test table called AATest. We have one flat file sitting on a landing pad called AATEST.txt; it has 10 columns. I have read many examples and steps but could not follow them. Do you have a step-by-step guide for reading data from a flat file into Teradata, ideally with screenshots? My email is arow2@hotmail.com. I really appreciate it. We have been asked to migrate from SQL Server to Teradata and are doing a small test first.
 
Second question: from SQL Server I downloaded the ODBC drivers. How can I connect from SQL Server to select from the Teradata table loaded above? Again, thank you.
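To make the question concrete, this is the kind of FastLoad script I am trying to write; the tdpid, credentials, delimiter, and the two column names are placeholders standing in for my 10 columns:

```
SESSIONS 4;
LOGON mytdpid/myuser,mypassword;
DATABASE SPIDER_TBLS;
BEGIN LOADING SPIDER_TBLS.AATest
   ERRORFILES SPIDER_TBLS.AATest_err1, SPIDER_TBLS.AATest_err2;
SET RECORD VARTEXT "|";
DEFINE
   col01 (VARCHAR(50)),
   col02 (VARCHAR(50))
FILE = AATEST.txt;
INSERT INTO SPIDER_TBLS.AATest (col01, col02)
VALUES (:col01, :col02);
END LOADING;
LOGOFF;
```

With SET RECORD VARTEXT, every DEFINEd field has to be VARCHAR; conversions happen at INSERT time.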

Mload Update: Records are not getting updated

Hello All,
I am running an MLoad job. In the source file the fields have different data types: DECIMAL, CHAR and VARCHAR. When I run the job, a major volume of records goes to the error table with error code 2797 (MLoad MARK MISSING UPDATE), meaning it did not find rows matching the given WHERE conditions.
But when I take a sample record from the error table and execute the same combination of WHERE conditions in SQL Assistant, I can see the records already exist in the target table. So why didn't the MLoad update find them?
I therefore suspect some discrepancy in the in_type_cd column, which has a CHAR data type. Do we need to do any special handling for CHAR fields?
---------------------------------------------------------
Primary Index is emp_no
---------------------------------------------------------
.LAYOUT datalayout INDICATORS;
.FIELD set_flag  1 CHAR(1);
.FIELD in_emp_no  * DECIMAL(18);
.FIELD in_dt_tm  * CHAR(19);
.FIELD in_type_cd  * CHAR(10);
.FIELD in_line_id  * VARCHAR(10);
-----------------------------------------------------------
.DML LABEL test_dml mark duplicate update rows;
UPDATE TABLE_ABC
     SET
     start_dt_tm    =:in_dt_tm,
     type_cd     =:in_type_cd,
     line_id           =:in_line_id
     WHERE emp_no                    =:in_emp_no
     and start_dt_tm      =:in_dt_tm
     and type_cd    =:in_type_cd
     and (line_id                  =:in_line_id
       or  line_id                = 'CRIC'
       or  line_id                = 'HOCK'
       or  line_id                = 'TEN');
--------------------------------------------------------------
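For reference, this is a variant I am considering with explicit casts and TRIM. It assumes my theory is right that the mismatch comes from CHAR trailing blanks and the implicit cast of the CHAR(19) in_dt_tm against a TIMESTAMP column; the FORMAT string assumes my date strings look like 'YYYY-MM-DD HH:MI:SS', and I am not certain MLoad accepts these functions in its DML:

```sql
UPDATE TABLE_ABC
SET  start_dt_tm = :in_dt_tm,
     type_cd     = :in_type_cd,
     line_id     = :in_line_id
WHERE emp_no        = :in_emp_no
  AND start_dt_tm   = CAST(:in_dt_tm AS TIMESTAMP(0)
                           FORMAT 'YYYY-MM-DDBHH:MI:SS')
  AND TRIM(type_cd) = TRIM(:in_type_cd)
  AND (line_id = :in_line_id
       OR line_id IN ('CRIC', 'HOCK', 'TEN'));
```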
 
 
Thanks
Abhi

Can we pass schema and sql statement as variables to tpt script

Hi,
Is it possible to pass the schema definition and SQL statement in a job variables file or as command-line parameters? %INCLUDE needs an exact filename for the schema definition, which ties the TPT script to one particular extract. We are working on a generic script for 100+ tables so that we can pass the schema and SQL in the jv file along with the instances.
 
DEFINE SCHEMA SOCPARC_SUBSET_PP
DESCRIPTION 'DW_ADDRESS'
(
@SCHEMADEF
);
VARCHAR SelectStmt        = @SQL
 
Job Variable script:
SCHEMADEF=`cat xyz.txt`
SQL=`cat abc.txt`
 

Need to cast an invalid date Time to a valid Date Time

I need to cast an invalid date-time to a valid date-time.
Example:  12/5/2014 2:55:29 PM   to   05/12/2014 14:55:29        DD/MM/YY hh:mi:ss
 
Thanks
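To be concrete, something like this is what I am after, assuming the input can first be zero-padded to fixed width ('12/5/2014 2:55:29 PM' -> '12/05/2014 02:55:29 PM'), since Teradata FORMAT casts expect fixed-width fields, and assuming I have the format tokens right (B for a blank, T for the AM/PM indicator):

```sql
-- Parse the 12-hour US-style string, then re-display it as
-- a 24-hour DD/MM/YYYY timestamp.
SELECT CAST(CAST('12/05/2014 02:55:29 PM' AS TIMESTAMP(0)
                 FORMAT 'MM/DD/YYYYBHH:MI:SSBT')
            AS TIMESTAMP(0) FORMAT 'DD/MM/YYYYBHH:MI:SS');
```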

Teradata destination ODBC connections failing in SSIS job

Windows Server 2012 R2

SQL Server Integration Services 2012

Teradata 13.10.00.14, Driver 14.10.00.06

 

We have a situation where ODBC connections to Teradata destination database keep getting errors from time to time, not every night though.

Some nights regular load jobs execute successfully and sometimes they get errors "There was an error trying to establish an Open Database Connectivity (ODBC) connection with the database server." - one or more component failed validation.

 

Here is a picture of one example: http://goo.gl/Z1ofuQ

 

The faulty component is usually different, not the same every time. The package may also differ. There are 4 packages starting in parallel at the same time, each having 1 or 2 sequences of dataflows and SQL tasks (about 60 dataflows and 40 SQL tasks in total in those packages). The packages have been migrated from DTS to SSIS and this kind of error situation has never occurred before.

So we checked what happens on the Teradata side - many simultaneous sessions, validation sessions at first then TPT Load and ODBC sessions, but the number of concurrent open sessions never exceeds something like 30-40. And there is a parameter in Teradata indicating that max allowed sessions is 120.

 

Two different kinds of connection managers are used - the Teradata Connection Manager, used in the Attunity Teradata Destination component for TPT loading, and the ODBC Connection Manager to execute SQL tasks and also for loading to Teradata destination tables when a low number of records is being transferred.

TPT Loads take 14 sessions each, ODBC destination and validation sessions apparently 1 per validation/component.

 

What could be the reason for this error? Where to look, what to check?

 

Thanks for any advice!

 

 

TPT - Delimited Data Parsing error: Invalid multi-byte character

I am developing a TPT load (Unix environment) for a data file (with UTF-8 encoding) to populate a Teradata table with columns defined as VARCHAR() CHARACTER SET UNICODE. One character in the data file is causing my load to fail. If I remove this character the load completes successfully.
When the data file is viewed via WinSCP the problem character appears as a square box; when I copy and paste the character into a Word document it appears as a "smiley face" emoji/emoticon type thing. WinSCP details the following attributes for the character: character 55357 (0xD83D, encoding UTF-8),
while a bit of googling suggests: character 55357, Unicode code point U+D83D, UTF-8 (hex) ed a0 bd.
I'm afraid this means nothing to me. What do I need to do to ensure that the TPT load job doesn't fail on these spurious UTF-8 characters, which appear not to be supported by the Teradata UTF-8 Unicode character set? I don't want to preprocess the file to remove this specific character, as tomorrow I could easily receive a file with a different problem character.
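The closest generic approach I have found, assuming it is acceptable to strip every invalid sequence rather than one specific character: the bytes ed a0 bd encode a UTF-16 surrogate half (U+D83D), which is not legal UTF-8 on its own, and iconv's -c flag drops anything that fails to convert:

```shell
#!/bin/sh
# Demo file: valid text wrapping the octal bytes 355 240 275
# (hex ED A0 BD) - an encoded UTF-16 surrogate, not legal UTF-8.
printf 'abc\355\240\275def\n' > input.dat

# -c makes iconv discard characters it cannot convert instead of
# aborting, so every malformed sequence is stripped, not just this one.
iconv -f UTF-8 -t UTF-8 -c < input.dat > input_clean.dat || true

cat input_clean.dat
```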
Thanks for any assistance
 
 
 

TPT Stream Loader - Create Macro Access error on target database (not working database).

Hello:
When trying to use TPT to stream data from SQL Server we're running into the following error.
'The user does not have CREATE MACRO access to database TargetDatabase'.
The TargetDatabase in this case is different from the WorkingDatabase below.  My understanding is that the default macro database is the restart table database which is the working database.  If the working database is defined then the logs and error tables should be qualified.
So does the target database need create macro access for the user when the working database has been defined?  We can set the MacroDatabase attribute, but I didn't think it was necessary.
Thank you for your help.  This is TPT 15.  The database version is 14.10. Below is the stream operator we are using.

DEFINE OPERATOR StreamOperator

TYPE Stream

SCHEMA *

ATTRIBUTES (

VARCHAR Tdpid = 'MyDSN'

,VARCHAR UserName = 'user'

,VARCHAR UserPassword = 'password'

,VARCHAR LogonMech = 'ldap'

,VARCHAR WorkingDatabase = 'MyWorkingDatabase'

,VARCHAR ErrorTable = 'MyWorkingDatabase.TargetTable_ET'

,VARCHAR LogTable = 'MyWorkingDatabase.TargetTable_tlog'

,VARCHAR DateForm = 'integerDate'

);

 

 

 

 
 

TDWallet support for non-interactive mode.

Hello Forum
 
Does TDWallet support Non-Interactive mode?
I have been trying to insert a few keys into TDWallet using a shell script, but it does not seem to work.
 
After some research I found that we can automate this with the Expect automation tool for Linux, and I was able to automate it using Expect and a Python script. But I am looking for direct TDWallet support for a non-interactive mode.
 
Will this be a feature of TDWallet 15 or 15.10, or is it available in the 14.10 release?

FastExport ISSUE !!!
0

Greetings TD experts, 
I am new to FastExport and I am facing a problem at runtime. After executing the script from the command line, I get the following error:

**** 14:36:41 UTY0847 Warning: RDBMS or network down. Trying to logon again.

**** 14:37:41 UTY8400 Network or RDBMS down,Cli error 207

**** 14:37:41 UTY2410 Total processor time used = '0.0936006 Seconds'

     .       Start : 14:28:25 - WED MAR 25, 2015

     .       End   : 14:37:41 - WED MAR 25, 2015

     .       Highest return code encountered = '12'.

 

Here is my script:

 

.LOGTABLE <db>.<tablename_lg>;

.RUN FILE <path>.logon.fxp; 

<databasename>;

.BEGIN EXPORT SESSIONS 20;

 

.EXPORT OUTFILE <path><filename>.txt MODE RECORD FORMAT TEXT;

 

SELECT cast(Ven_ID as VARCHAR(2))||','||cast(VERTICAL as VARCHAR(17))||','||cast(CATEGORY as VARCHAR(8))

FROM <db>.<table>;

 

.END EXPORT;

.LOGOFF; 

 

I created a logon file which contains the username password as someone suggested its secure this way.

script:

.LOGON username,pwd;

 

 

Please help me figure out what's going wrong here, because I am not able to move further. From the error, my understanding is that the network is down, but in reality it isn't. The tool waits for about 10-15 minutes after executing the logon. Is there something wrong with my logon? I am working on the Windows platform. I am also not sure where to find my tdpid in case I need to mention it, but the syntax says it's optional and a default value will be used if it is not given.
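For reference, my understanding is that a logon file normally carries the system name (tdpid) as well as the credentials; a sketch with illustrative names:

```
.LOGON mytdpid/username,pwd;
```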

Thanks a lot people in advance :) 

Have a great day !

Merging BTEQ and shell scripts

I want to use shell variables in BTEQ commands and shell commands in a BTEQ script, e.g. create table $variable (some xyz column list); and some sort of shell commands within BTEQ scripts.
Can you please assist me in doing this?
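As a sketch of what I am trying to achieve: an unquoted here-document expands shell variables before bteq sees the script, so the table name gets substituted. Here 'cat' stands in for 'bteq' so the expanded text is visible; names and credentials are placeholders:

```shell
#!/bin/sh
# Shell variables expand inside an unquoted here-document, so the
# table name below is substituted before the script text reaches bteq.
TABLENAME="MYDB.TEST_TABLE"

# 'cat' stands in for 'bteq' so the expanded script is visible;
# in a real job the same here-document would be piped into bteq.
cat <<EOF
.LOGON mytdpid/myuser,mypassword;
CREATE TABLE ${TABLENAME} (col1 INTEGER);
.QUIT 0;
EOF
```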
 
Thanks

Loading from DB2 to Teradata with SSIS 2012 converts content with special characters ("zero characters") to NULL.

Loading from DB2 to Teradata with SSIS 2012 converts content with special characters ("zero characters") to NULL.

 

When I hit PREVIEW in SSIS DB2 OLE DB Source component, I can see the data correctly but when I load it into some destination file or database table, the content will be lost. NULL. Other rows are OK but the ones with some special characters will get lost.

http://hot.ee/phil/work/DB2_zero_character.png

 

How to fix this? How to load correctly from DB2 with SQL Server Integration Services?

 

I have tried Microsoft OLEDB driver for DB2 and also IBM OLEDB provider for DB2, they both have the same issue.

FASTLOAD: Can it skip blank lines?
0

I did a few searches and couldn't turn up anything on this topic, so I thought I'd ask. I've already worked around it by preprocessing my data file through sed to remove any blank lines. For future reference, is there an option in FastLoad that will have it skip blank lines?
Thx,
Rik
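For reference, the sed preprocessing step I mentioned looks like this (filenames illustrative, with a small demo input built in):

```shell
#!/bin/sh
# Demo input: two data rows separated by a blank and a whitespace-only line.
printf 'row1\n\n   \nrow2\n' > data.txt

# Delete lines that are empty or contain only whitespace before
# handing the file to fastload.
sed '/^[[:space:]]*$/d' data.txt > data_clean.txt

cat data_clean.txt
```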

Updating SQL Assistant 15
0

Hello, I recently downloaded TTU v15.00, which seems to have a lot of bugs on my machine. I see some comments about updates to SQL Assistant, but I can't find an updated version or an updater. How can I update SQL Assistant to the latest version?
Thanks

bteq .OS calls to UNIX "date" command
0

I have a BTEQ .RUN command sandwiched by .OS calls which log the start and end date/times of the .RUN. The dates are obtained by calls to the UNIX 'date' command. The START date is fine; however, my COMPLETED date is always equal to my START date. I don't see how that's possible, because the 'date' command is invoked fresh on the COMPLETED line. How can I fix this to get the real, actual COMPLETED date to print?
Here's the code:
.OS echo Running TERADATA script <script name>        on `date`>&2;                      <=========== this date is correct.
.RUN FILE = <script file>       ;
.IF ERRORCODE <> 0 THEN .GOTO BADEXIT;

.LABEL GOODEXIT
.OS echo TERADATA file <script name>         COMPLETED on `date` >&2;                  <=========== this date is equal to the date above, even though it should be several minutes later.
.QUIT 0;
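In case it matters, my script is driven from a shell here-document, and I suspect that is the cause: with an unquoted delimiter the shell substitutes every `date` while reading the here-document, before bteq runs anything, so both stamps come from script start-up. Quoting the delimiter passes the backquotes through so bteq's .OS runs date at execution time. A small demonstration of the two behaviours:

```shell
#!/bin/sh
# Unquoted delimiter: the shell substitutes every `date` while it
# reads the here-document -- before bteq would execute anything --
# so a start and an end stamp taken this way can never differ by
# the runtime of the .RUN in between.
cat <<EOF
started  at `date`
finished at `date`
EOF

# Quoted delimiter: the backquotes pass through untouched, so bteq's
# .OS would run 'date' itself at the moment each line executes.
cat <<'EOF'
.OS echo started  on `date` >&2;
.OS echo finished on `date` >&2;
EOF
```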
