Channel: Teradata Downloads - Tools
Viewing all 870 articles
Browse latest View live

Concatenate one column values into single row value


Hi,
Can you help me write a query that returns the columns of a multi-column index as a single row per table?
From dbc.indices we get output like this:
==============================
DBName  TBName  IndexType  ColumnName
ABC     Table1  P          ID1
ABC     Table1  P          ID2
But I need the output like this:
==============================
DBName  TBName  IndexType  ColumnName
ABC     Table1  P          ID1, ID2
=============================================
Kindly help me with the SQL query.
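One common approach, sketched here under the assumption of a Teradata release where XMLAGG is available (14.10 and later; on older releases a recursive query does the same job). View and column names follow dbc.IndicesV:

```sql
SELECT  DatabaseName,
        TableName,
        IndexType,
        TRIM(TRAILING ',' FROM (XMLAGG(TRIM(ColumnName) || ','
             ORDER BY ColumnPosition) (VARCHAR(10000)))) AS ColumnList
FROM    dbc.IndicesV
WHERE   DatabaseName = 'ABC'
GROUP BY DatabaseName, TableName, IndexType;
```

ORDER BY ColumnPosition keeps the columns in index order (ID1, ID2).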


FastLoad failing in mainframe job


Hi,
 
I am trying to load a CSV file into a Teradata table using FastLoad. I FTPed the file from my desktop to the mainframe server; the values in the file are comma separated. The FastLoad script in the mainframe job has the SET RECORD VARTEXT "," command before the DEFINE command, but the job keeps failing with the error "FDL4800 invalid fastload statement" on:
SET RECORD VARTEXT ","  ;
I am not sure what the mistake is here. Please advise.
 
Thanks in advance for the support.
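For comparison, VARTEXT is documented for network-attached FastLoad; channel-attached (mainframe) FastLoad may simply not accept it, which would explain an FDL4800 on an otherwise well-formed statement. A minimal network-attached sketch (logon details and names are placeholders):

```
SET RECORD VARTEXT ",";
LOGON tdpid/userid,password;
BEGIN LOADING mydb.mytable ERRORFILES mydb.err1, mydb.err2;
DEFINE col1 (VARCHAR(25)),
       col2 (VARCHAR(25))
FILE = mydata.csv;
INSERT INTO mydb.mytable VALUES (:col1, :col2);
END LOADING;
LOGOFF;
```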


silent installation of teradata client on win7 - response file


I'm trying to automate the installation on Win7, but it did not work. I ran the installer in record mode expecting it to generate a response file (.iss), but no response file was generated:
TeradataToolsAndUtilitiesBase__windows_i386.14.10.09.00.exe /r /f1"C:\temp\ttu.iss"
Can someone help please?
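Assuming the TTU package behaves like a standard InstallShield setup (newer TTU bundles are suite wrappers, which can swallow these switches), the usual record/replay pair looks like this; paths are examples:

```
REM Record: run once interactively; the answers are captured to the .iss file
TeradataToolsAndUtilitiesBase__windows_i386.14.10.09.00.exe /r /f1"C:\temp\ttu.iss"

REM Replay silently on other machines, logging the result
TeradataToolsAndUtilitiesBase__windows_i386.14.10.09.00.exe /s /f1"C:\temp\ttu.iss" /f2"C:\temp\ttu.log"
```

If no .iss appears after recording, look for setup.iss in %WINDIR% — InstallShield falls back to that location when the /f1 path is not honored.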


TPT DataConnector Error : Delimited Data Parsing error: Input record exceeds allocated storage


Hello,
I am using the TPT DataConnector producer and the SQL Inserter operator to load CLOB data from a tab-delimited file. The maximum length of the text I am loading into the CLOB is approximately 90,000 characters.
I am unable to load the file; it fails with the error below.
FILE_READER: TPT19350 I/O error on file 'C:\redlogs\sample_data.fmt'.
FILE_READER: TPT19134 !ERROR! Fatal data error processing file 'C:\redlogs\sample_data.fmt'. Delimited Data Parsing error: Input record exceeds allocated storage on row 16.
FILE_READER: TPT19003 TPT Exit code set to 8.
TPT File :
USING CHARACTER SET ASCII
DEFINE JOB L_2_txtTABLE_FROM_FILE
DESCRIPTION 'LOAD L_2_txt TABLE FROM A FILE'
(
  DEFINE SCHEMA L_2_txt_SCHEMA
  DESCRIPTION 'TABLE L_2_txt SCHEMA'
  (
      "COL1"                                     varchar(39)
    , "Col2"                                     varchar(39)
    , "Col3"                                     CLOB(6213140) AS DEFERRED BY NAME
  );
 
  DEFINE OPERATOR DDL_OPERATOR()
  DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
  TYPE DDL
  ATTRIBUTES
  (
    VARCHAR ARRAY ErrorList = ['3706','3803','3807'],
    VARCHAR DateForm,
    VARCHAR PrivateLogName = 'L_2_txt_DDL',
    VARCHAR TdpId = 'xxx.xxx.xx.xxx',
    VARCHAR UserName = 'USR',
    VARCHAR UserPassword = @TDPassword,
    VARCHAR AccountID
   );
 
  DEFINE OPERATOR FILE_READER()
  DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
  TYPE DATACONNECTOR PRODUCER
  SCHEMA L_2_txt_SCHEMA
  ATTRIBUTES
  (
    VARCHAR Format = 'Delimited',
    VARCHAR TextDelimiter = 'TAB',
    VARCHAR IndicatorMode = 'N',
    VARCHAR OpenMode = 'Read',
    VARCHAR FileName = 'C:\redlogs\sample_data.fmt'
  );
 
    DEFINE OPERATOR SQL_INSERTER ()
    DESCRIPTION 'TERADATA INSERTER UTILITY'
    TYPE INSERTER
    INPUT SCHEMA *
    ATTRIBUTES
    (
    VARCHAR TraceLevel = 'None',
    VARCHAR PrivateLogName = 'ins_log',
    VARCHAR Tdpid = 'xxx.xxx.xx.xxx',
    VARCHAR UserName = 'USR',
    VARCHAR UserPassword = @TDPassword
    );
 
  STEP setup_tables
  (
    APPLY
    ('DROP TABLE DLS.L_2_txt_E1;'),
    ('DROP TABLE DLS.L_2_txt_E2;'),
    ('DROP TABLE DLS.L_2_txt_WT;'),
    ('DROP TABLE DLS.L_2_txt_LT;')
    TO OPERATOR (DDL_OPERATOR () );
  );
 
    STEP CREATE_SOURCE_TABLE
    (
    APPLY
    ('drop table DLS.L_2_txt ;'),
    ('create table DLS.L_2_txt , NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL
   (
        col1 VARCHAR(39),
        col2 VARCHAR(39),
        col3 clob(6213140)
    ) primary index L_2_txt_idx_PR (COL1);')
    TO OPERATOR ( DDL_OPERATOR () );
    );
 
    STEP LOADING_DATA_TO_SOURCE_TABLE
    (
    APPLY
    (
        'INSERT INTO DLS.L_2_txt
         values (:COL1, :COL2, :COL3);'
    )
 
   TO OPERATOR (SQL_INSERTER [1])
 
   SELECT * FROM OPERATOR (FILE_READER ());
);
);
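One hypothesis worth checking: with CLOB(...) AS DEFERRED BY NAME, the DataConnector expects the delimited field to contain the *path of a file* holding the CLOB value, not the CLOB text inline, and inline delimited records are capped well below 90,000 characters. The data file would then look something like this (`<TAB>` stands for a tab character; paths are examples):

```
A0001<TAB>B0001<TAB>C:\redlogs\clobs\row1.txt
A0002<TAB>B0002<TAB>C:\redlogs\clobs\row2.txt
```

Each referenced file holds one row's CLOB value, so the long text never has to fit inside the delimited record itself.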


ODBC error while loading data using SSIS


Hi,
I am trying to load a delimited flat file with " as the text qualifier. We use SSIS 2008 with the Attunity connector for the Teradata destination.
The load seems to run, but at the end it fails with the error below and doesn't commit any records:
 

[Teradata Destination [698]] Error: There was an ODBC error while trying to read error tables. Invalid TPT operator.

 

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Teradata Destination" (698) failed with error code 0x80004005 while processing input "Teradata Destination Input" (721). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

 

Can anyone suggest what the reason for this could be? I tried increasing the error counts as well, but it didn't help.
Any help would be appreciated.
 


Change Password Option in Teradata SQL Assistant


Hello Everyone,
I tried to change my password through the SQL Assistant 12.0 Change Password option, but the 'OK' button is not enabled.
I came to know that the 'OK' button is enabled only if the current password is correct and the two New Password fields match exactly. Until then the button stays disabled, and no error message is shown by the tool.
But when I logged in to a remote server using ODBC-LDAP with the same SQL Assistant tool, it at least shows an error message if the current password is incorrect or the new password does not meet the password requirements.
 
Can anyone let me know why:
a) an error message is displayed when changing the password on the remote server using ODBC-LDAP,
b) no error message is displayed when changing the password on the local server using ODBC without any authentication mechanism, and
c) whether there is any option that needs to be set on the servers for this message display?
 
Thanks in advance!
Syamala Meduri


Problem with FastExport and DECIMAL column on TD 14.10


Hello,
I am experiencing an error while trying to FastExport data from the TD server to a file; the error message is "UTY8713 RDBMS failure, 2617: Overflow occurred computing an expression involving MV_FAK_VMON_DTV_ALL.DTV".
The column in question is DECIMAL(38,30) and has that odd format because the whole warehouse was ported from an Oracle DB without any real rethinking :-(. I can export the table without problems if I manually exclude every DECIMAL(38,30) column. Also, if I CAST to CHAR(38), the data exports without problems, so there seems to be no problem with the table itself, but rather with the data format and the FastExport utility.
Unfortunately I cannot manually override the SELECT statement of that export, as the export is part of a script that automatically exports all tables listed in a control table (so I don't know beforehand which tables will be exported) and is generated automatically.
Our database is TD 14.10 with the FastExport utility (FEXP.14.10.00.06) running on a Linux platform. The server has all the latest patches installed.
Has anybody ever run into such a problem, or can somebody give a hint as to the root cause?
As the warehouse (that the table is in) is operational and I only have limited access to the ODI scripts doing the ETL, I will probably have to live with the table structure, but I need to export the data somehow. I chose FastExport because I can automatically have it generate a MultiLoad script to import the exported data on a remote server (also a TD machine).
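For readers hitting the same 2617, the workaround the poster describes looks like this in the export SELECT (the poster used CHAR(38); CHAR(40) leaves room for a sign and the decimal point):

```sql
SELECT CAST(DTV AS CHAR(40)) AS DTV
FROM   MV_FAK_VMON_DTV_ALL;
```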
Any help is highly appreciated
Best regards,
 
SOS


Fastload Inserting Default User and Date


I have the following fastload script

CREATE MULTISET TABLE "TABLE" (
    FIELD1     VARCHAR(25) CHARACTER SET UNICODE CASESPECIFIC,
    FIELD2     VARCHAR(25) CHARACTER SET UNICODE CASESPECIFIC,
    FIELD3     VARCHAR(25) CHARACTER SET UNICODE CASESPECIFIC,
    FIELD4     VARCHAR(25) CHARACTER SET UNICODE CASESPECIFIC,
    LOAD_USER  VARCHAR(20) CHARACTER SET LATIN CASESPECIFIC DEFAULT USER,
    LOAD_DATE  TIMESTAMP(6) DEFAULT CURRENT_TIMESTAMP(6) ) ;

BEGIN LOADING "TABLE"
    ERRORFILES TABLE_errors1, TABLE_errors2
    INDICATORS ;

AXSMOD Oledb_Axsmod "noprompt jobid=1";

DEFINE FIELD1    (VARCHAR(765)),
       FIELD2    (VARCHAR(765)),
       FIELD3    (VARCHAR(765)),
       FIELD4    (VARCHAR(765)),
       LOAD_USER (VARCHAR(765)),
       LOAD_DATE (DATE), FILE=Untitled ;

INSERT INTO "TABLE" ( FIELD1, FIELD2, FIELD3, FIELD4, LOAD_USER, LOAD_DATE )
    VALUES ( :MITS_CLAIM_ID, :PLAN_CODE, :MBR_MEDICAID_NBR,
             :DTE_FIRST_SVC_HDR, :DEFAULT, :DEFAULT ) ;

 

But it crashes, saying DEFAULT and DEFAULT are not defined. I've tried variations, including USER(), USER, DATE, etc., but they always come back as "not defined". And if I just leave the fields out of my VALUES statement, I receive the error message that my column name list is longer than my value list. I'm trying to default the current user and date. I've tried every variation I can think of, including trying to hardcode "USER" instead of :DEFAULT, with no luck.
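A sketch of one possible fix, under the assumption that FastLoad applies column defaults to any columns omitted from the INSERT: drop LOAD_USER and LOAD_DATE from both the column list and the VALUES list (removing them from only one side produces the "column name list longer than value list" error described above), and from the DEFINE as well. The :host variable names must match the DEFINE field names:

```
DEFINE FIELD1 (VARCHAR(765)),
       FIELD2 (VARCHAR(765)),
       FIELD3 (VARCHAR(765)),
       FIELD4 (VARCHAR(765)), FILE=Untitled ;

INSERT INTO "TABLE" ( FIELD1, FIELD2, FIELD3, FIELD4 )
    VALUES ( :FIELD1, :FIELD2, :FIELD3, :FIELD4 ) ;
```

The two omitted columns should then pick up DEFAULT USER and DEFAULT CURRENT_TIMESTAMP(6) from the table DDL.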


Arcmain15 unable to read archive from TD14


I have two Teradata Express VMs side by side...
   The 1st  is Release: TD14.00.00.02 Ver TD14.00.00.14
   The 2nd is Release: TD15.00.01.01 Ver TD15.00.01.01
... and I want to move a set of databases from v14 to v15.
Just as in the past I planned to use arcmain to move the databases.  Here's an extract from the scripts:

 

export:
    LOGON TERADATA14/DBC,DBC;
    archive data tables (CIM_APPLICATION) ALL,
    release lock,
    file = CIM_APP;
    LOGOFF;

import:
    LOGON TERADATA15/dbc,dbc;
    copy DATA tables (CIM_APPLICATION),
    RELEASE LOCK,
    file=CIM_APP;
    LOGOFF;

The export works fine and so, too, does the import if I restore the data back to TD14.

However, I get the following error if I try and restore the data to TD15...

 

02/11/2015 16:09:09  ARC HAS REQUESTED 4 SESSIONS, TASM HAS GRANTED IT 4 SESSIONS
02/11/2015 16:09:09
02/11/2015 16:09:09  UTILITY EVENT NUMBER  - 67
02/11/2015 16:09:10  LOGGED ON    4 SESSIONS
02/11/2015 16:09:10  *** Failure ARC0805:Access Module returned error code 29: Empty file on read open.
02/11/2015 16:09:10  LOGGED OFF   7 SESSIONS
02/11/2015 16:09:10  ARCMAIN TERMINATED WITH SEVERITY 12

What am I missing that is preventing the data from loading into TD15?
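Error 29 ("Empty file on read open") suggests arcmain on the TD15 VM found the archive file empty or unreadable; when an archive is moved between machines, a text-mode (ASCII) FTP transfer is a common way to corrupt it, so re-transferring in binary mode is worth checking first. The archive can also be validated on the TD15 side with ARC's ANALYZE statement, sketched here from the ARC utility syntax (adjust as needed):

```
LOGON TERADATA15/dbc,dbc;
ANALYZE ALL, DISPLAY LONG, FILE=CIM_APP;
LOGOFF;
```

If ANALYZE cannot list the archive contents, the file itself is damaged or empty on that host.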

Any help would be appreciated

 


Unable to load the saved database tree information


Sometimes I get the error "unable to load the saved database tree information" when I open Teradata SQL Assistant, and I lose my saved database tree. My SQL Assistant version is 14.0.0.1.
How can I solve this problem?
-Raghu


Teradata Warehouse Miner


Hi Team,
 
I have downloaded the TWM tool and the Teradata database, but I am not able to connect the tool to the database.
Please tell me the steps to connect it to the Teradata database, so I can write queries and procedures.
 
And please tell me how to start the Teradata database.
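As a starting point for the second question, assuming the database is a Teradata Express VM (SLES Linux), these commands are commonly used on the VM; availability may vary by release:

```shell
pdestate -a            # report the current PDE/DBS state ("RUN/STARTED" means up)
/etc/init.d/tpa start  # start Teradata if it is not already running
```

Once the database is running, TWM connects through an ODBC DSN that points at the VM's IP address.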
Regards,
Vishnu Rathore


Failed to obtain shared memory segment for for column data buffer, due to error 48.


I'm using Teradata Tools and Utilities 15, TPT, and the ODBC operator to move data from SQL Server to Teradata. Every once in a while I get this error and I don't know why. If anyone has any experience with this error, help would be appreciated.

 

Services|ODBCOperator: TPT17187: Failed to obtain shared memory segment for for column data buffer, due to error 48.

Services|ODBCOperator: TPT17174: Error 0 allocating memory for row size buffer

Services|ODBCOperator: disconnecting sessions


Fields shifted in the database using TPT


Hello everybody,
I have a huge problem loading data from a flat file to a Teradata DB.
What happens is that the data loads correctly into the DB except for some rows whose fields contain a string with the letter "é"; if I switch the "é" to the letter "e" it works fine.
Could anyone please tell me how to solve this problem? It's really urgent.
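This looks like a character-set mismatch between the file encoding and the session character set. As a sketch, if the file is UTF-8, declaring the job's character set to match is the usual first step (job name is a placeholder):

```
/* Session character set declared at the top of the TPT job script */
USING CHARACTER SET UTF8
DEFINE JOB load_from_flat_file
DESCRIPTION 'load a UTF-8 delimited file'
(
  /* schema, operators, and APPLY as before; note that with UTF8 the
     VARCHAR lengths in the schema are byte lengths, so allow up to 3x
     the character count for accented characters like "é" */
);
```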
Thank you.


TPT Operator Template error "Output Schema does not match data from SELECT statement"


Hello !!
I am getting the error "TPT12108: Output Schema does not match data from SELECT statement" when I use a TPT script (operator templates) to copy a table from TDSERVER-A to TDSERVER-B. This script usually runs fine; I am seeing the error this time because the table has a TIMESTAMP(3) column. Can someone take a look at the script/error and tell me whether I should make adjustments, or whether there are limitations with operator templates in 13.10?
TPT Version:
---------------
Teradata Parallel Transporter Executor Version 13.10.00.10
Teradata Parallel Transporter Coordinator Version 13.10.00.10
Teradata Parallel Transporter Executor Version 13.10.00.10
Teradata Parallel Transporter Load Operator Version 13.10.00.04
Teradata Parallel Transporter Export Operator Version 13.10.00.06

DDL of Table being copied:
---------------------------------
CREATE MULTISET TABLE SANDBOX.TEST_TABLEA ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      PKey VARCHAR(16) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
      BusinessModifiedTS TIMESTAMP(3) FORMAT 'YYYY-MM-DDbHH:MI:SS.S(3)',
      BusinessModifiedDT DATE FORMAT 'YYYY-MM-DD'
      )
PRIMARY INDEX ( PKey );

 

TPT Script:
-------------
DEFINE JOB load_source_to_target_table
DESCRIPTION 'This job is to export source table and load to target table using operator templates'
(
  STEP STEP_LOAD
  (
    APPLY $INSERT
    TO OPERATOR(
      $LOAD [@LoadInstances]
      ATTR
      (
        PrivateLogName = @TargetTable || '.load.log'
      )
    )
    SELECT *
    FROM OPERATOR
    (
      $EXPORT[@ExportInstances]
      ATTR
      (
        PrivateLogName = @TargetTable || '.export.log'
      )
    );
  );
);

 

Job Variables File:
----------------------
SourceTdpId = 'TDQA',
SourceUserName ='tXXXXX',
SourceUserPassword='XXXXXX',
SourceDBTable='SANDBOX.TEST_TABLEA',
SelectStmt='select * from ' || @SourceDBTable ||' ;',
ExportInstances=1,
SourceMaxSessions=40,
/* The following variables should be initialized with target database server/user/table details */
TargetTdpId = 'TDDEV',
TargetUserName ='tXXXXX',
TargetUserPassword='XXXXX',
TargetDatabase='two_week_space',
TargetTable='TEST_TABLEA',
TargetWorkingDatabase='SANDBOX',
ExportTraceLevel='All',
LogTable = '' || @TargetWorkingDatabase || '.' || @TargetTable || 'lg',
LoadInstances=1,
TargetMaxSessions=40

 

 

TPT Output:
--------------
Teradata Parallel Transporter Version 13.10.00.10
Job log: /opt/teradata/client/13.10/tbuild/logs/tXXXXX-2408.out
Job id is tXXXXX-2408, running on XXXXXX
Teradata Parallel Transporter Load Operator Version 13.10.00.04
$LOAD: private log specified: TEST_TABLEA.load.log
Teradata Parallel Transporter Export Operator Version 13.10.00.06
$EXPORT: private log specified: TEST_TABLEA.export.log-1
$LOAD: connecting sessions
$EXPORT: connecting sessions
$EXPORT: TPT12108: Output Schema does not match data from SELECT statement
$EXPORT: disconnecting sessions
$EXPORT: Total processor time used = '0.119924 Second(s)'
$EXPORT: Start : Tue Feb 17 13:23:05 2015
$EXPORT: End   : Tue Feb 17 13:23:06 2015
$LOAD: preparing target table
$LOAD: entering Acquisition Phase
$LOAD: disconnecting sessions
$LOAD: Total processor time used = '0.630907 Second(s)'
$LOAD: Start : Tue Feb 17 13:23:05 2015
$LOAD: End   : Tue Feb 17 13:23:11 2015
Job step STEP_LOAD terminated (status 12)
Job tXXXXX terminated (status 12)
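One workaround sometimes used for fractional-second timestamps in older TPT releases, offered as an untested sketch: cast the TIMESTAMP(3) column to a fixed-width character type in the export SELECT so the exported data matches the generated schema (column names from the DDL above; 23 characters covers 'YYYY-MM-DD HH:MI:SS.SSS'):

```
SelectStmt = 'SELECT PKey,
                     CAST(BusinessModifiedTS AS CHAR(23)),
                     BusinessModifiedDT
              FROM SANDBOX.TEST_TABLEA;',
```

The Load operator then relies on Teradata's implicit character-to-timestamp conversion when inserting into the target column.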

 

Thanks !!


Issue with TPT named pipes on windows


hi all,
I am having a problem getting a TPT named-pipes job to complete. It takes several checkpoints and runs for a while (I am trying to copy approx. 18M rows), but then appears to hang, and the target table is still 'being loaded' when I try to get a row count. Any ideas are welcome.

C:\Windows\System32>tlogview -j Contact_Email_Ct_pipe_20150216_111826-12879
TPT_INFRA: TPT04101: Warning: TMSM failed to initialize
Teradata Parallel Transporter Executor Version 13.10.00.02
Teradata Parallel Transporter Coordinator Version 13.10.00.02
Teradata Parallel Transporter Executor Version 13.10.00.02
Teradata Parallel Transporter Executor Version 13.10.00.02
Teradata Parallel Transporter Executor Version 13.10.00.02
Teradata Parallel Transporter Executor Version 13.10.00.02
Teradata Parallel Transporter Executor Version 13.10.00.02
ACCESS_MODULE_READER: TPT19206 Attribute 'TraceLevel' value reset to 'MILESTONES'.
ACCESS_MODULE_READER: TPT19206 Attribute 'TraceLevel' value reset to 'MILESTONES'.
Teradata Parallel Transporter DataConnector Version 13.10.00.02
ACCESS_MODULE_READER Instance 2 directing private log report to 'dataconnector_log-2'.
ACCESS_MODULE_READER Instance 3 directing private log report to 'dataconnector_log-3'.
ACCESS_MODULE_READER: TPT19206 Attribute 'TraceLevel' value reset to 'MILESTONES'.
ACCESS_MODULE_READER Instance 1 directing private log report to 'dataconnector_log-1'.
ACCESS_MODULE_READER: TPT19008 DataConnector Producer operator Instances: 3
Teradata Parallel Transporter Load Operator Version 13.10.00.02
Contact_Email_CT_loadP1: private log specified: Contact_Email_CT_strm_log-1
ACCESS_MODULE_READER: TPT19003 ECI operator ID: ACCESS_MODULE_READER-4240
ACCESS_MODULE_READER: TPT19012 No files assigned to instance 2.  This instance will be inactive.
ACCESS_MODULE_READER: TPT19012 No files assigned to instance 3.  This instance will be inactive.
Contact_Email_CT_loadP1: connecting sessions
ACCESS_MODULE_READER: TPT19222 Operator instance 1 processing file '\\.\pipe\npp1'.
Contact_Email_CT_loadP1: preparing target table
Contact_Email_CT_loadP1: entering Acquisition Phase
Job is running in Buffer Mode
Task(APPLY_1[0001]): checkpoint completed, status = Success
Task(APPLY_1[0003]): checkpoint completed, status = Success
Task(SELECT_2[0002]): checkpoint completed, status = Success
Task(SELECT_2[0003]): checkpoint completed, status = Success
Task(APPLY_1[0002]): checkpoint completed, status = Success
Task(SELECT_2[0001]): checkpoint completed, status = Success

C:\Windows\System32>

 

Also, why do I get the following?
ACCESS_MODULE_READER: TPT19012 No files assigned to instance 2.  This instance will be inactive.
ACCESS_MODULE_READER: TPT19012 No files assigned to instance 3.  This instance will be inactive.
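On the TPT19012 messages: the log shows three DataConnector producer instances but only a single pipe ('\\.\pipe\npp1'), and each file or pipe is read by exactly one instance, so instances 2 and 3 have nothing to do. A sketch of the matching invocation, assuming the job's APPLY references the reader like this:

```
/* One pipe -> one reader instance; extra instances would sit idle */
SELECT * FROM OPERATOR ( ACCESS_MODULE_READER [1] );
```

The idle instances are harmless, but reducing the instance count (or supplying one pipe per instance) keeps the log clean.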


td-ttu-15.00_for_Windows correctly installed, but now how do I create a database?


td-ttu-15.00 for Windows installed correctly, but now how do I create a database? When I try to reach my office's Teradata server it fails; I guess that would be some kind of firewall, wouldn't it? Since I am not able to reach a server or download a demo database to practice the basic commands on, how can I create my own database for the first time?
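Worth noting: TTU is the client tool set only, so creating a database requires a Teradata system to connect to, such as a local Teradata Express VM. Once one is running, a first practice database can be carved out of dbc's space, e.g. from BTEQ (logon and size are examples):

```sql
.LOGON 127.0.0.1/dbc,dbc
CREATE DATABASE sandbox FROM dbc AS PERM = 100000000;  -- ~100 MB of perm space
.LOGOFF
```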
 
thanks in advance
  



SQL Assistant 15.0 - make row headers selection permanent


Is there a way to make the 'Row Headers' setting permanent and applicable to all answer sets in SQL Assistant v15?
I recently upgraded from SQL Assistant 12 to 15 and I noticed that you can turn off Row Headers only on a per results set basis and the setting is not permanent.  There is a toolbar button for 'Row Headers' - turns row numbering on or off.  Alternatively, you can right click on each answer set and select 'Row Headers'.
Frequently, I copy/paste my answer sets into Excel and I don't want those row numbers to carry over as that just adds an extra step to my process.


BTEQ field separator.


Hi,

In my BTEQ script, I'm selecting two columns of a table, using the statement .SET SEPARATOR "|", and exporting the output to a local file. When I check that file, only one column is present. Please help me with this.
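Two things worth checking, sketched below: .SET SEPARATOR only affects REPORT-format output (.EXPORT REPORT, not .EXPORT DATA), and if the output width is too small BTEQ folds the second column onto its own line, which can look like a missing column. A minimal example (logon and names are placeholders):

```
.LOGON tdpid/user,password
.SET WIDTH 254
.SET SEPARATOR '|'
.EXPORT REPORT FILE = out.txt
SELECT col1, col2 FROM mydb.mytable;
.EXPORT RESET
.LOGOFF
```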

Thanks in advance,
Vinay.


TPT Loading


Hi,
I'm trying to load data from an external delimited file using TPT. One of the columns in the file contains dates in the '24FEB2015' (DDMONYYYY) format, and I need to insert this value into a DATE column, but it is not working. Please help me with this.
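A common pattern, sketched here: declare the field as VARCHAR in the TPT schema and convert it in the APPLY INSERT with a Teradata FORMAT cast — 'DDMMMYYYY' matches values like 24FEB2015. Table and field names are placeholders:

```sql
INSERT INTO mydb.mytable ( id_col, date_col )
VALUES ( :id_col, CAST(:date_col AS DATE FORMAT 'DDMMMYYYY') );
```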
 
Thanks,
Vinay.
