Teradata Downloads - Tools

TTU (Linux) silent installation or Chef recipe


Does anyone know whether it is possible to install TTU silently (non-interactively) on Linux? Or has anyone created a Chef recipe for it? Thanks.
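In case it helps, here is a minimal sketch of a non-interactive install, assuming the TTU Linux bundle unpacks into a directory of RPM packages (the path and package file names below are illustrative placeholders; list your actual media directory first):

# Unpack the TTU media, then install the packages without prompts.
# Package names and install order vary by TTU release.
cd /tmp/ttu-linux/
rpm -ivh tdicu*.rpm TeraGSS*.rpm cliv2*.rpm bteq*.rpm

A Chef recipe could then wrap the same rpm (or yum localinstall) commands in package/execute resources once the package order for your release is known.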


Error 9128 : Transaction exceeds max. number of Row Hash locks


Hello All,
I am trying to load a file with around 12 million records into an empty table, and I tried using BTEQ.
I included BEGIN TRANSACTION and END TRANSACTION statements in my BTEQ import script.
However, after some time the script fails with "Error 9128: Transaction exceeds max. number of Row Hash locks".
Can someone help me understand what exactly the issue is, and what a row hash lock is?
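For background, every row inserted inside a single open transaction holds its own row hash lock until END TRANSACTION, so a 12-million-row import inside one BT/ET pair overflows the transaction's lock limit. A minimal sketch of the usual workaround (the file name, USING fields, and table names are placeholders): drop the BT/ET pair and let BTEQ commit as it goes, packing several records per request:

.IMPORT VARTEXT ',' FILE = mydata.txt
.REPEAT * PACK 100
USING (c1 VARCHAR(20), c2 VARCHAR(20))
INSERT INTO mydb.mytable (col1, col2) VALUES (:c1, :c2);

Without an explicit transaction around the loop, each packed request commits on its own, so locks are released as the load progresses; for volumes this size, FastLoad or the TPT LOAD operator would avoid the issue entirely.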
 
 


FastLoad issue


Hi experts, this is the error I got while executing the FastLoad script:
0002 .logon 127.0.0.1/dbc,
**** 16:43:30 Teradata Database Release: 14.10.01.02
**** 16:43:30 Teradata Database Version: 14.10.01.01
**** 16:43:30 Number of AMPs available: 2
**** 16:43:30 Current CLI or RDBMS allows maximum row size: 64K
**** 16:43:30 Character set for this job: ASCII

0003 set record vartext ",";
**** 16:43:30 Now set to read 'Variable-Length Text' records
**** 16:43:30 Delimiter character(s) is set to ','
**** 16:43:30 Command completed successfully

0004 begin loading retail.employee;
**** 16:43:30 Number of FastLoad sessions requested = 4
**** 16:43:30 Number of FastLoad sessions connected = 2
**** 16:43:30 FDL4808 LOGON successful
**** 16:43:30 RDBMS error 3706: Syntax error: expected something
              between the word 'employee' and the 'WITH' keyword.
     ===================================================================
     =          Logoff/Disconnect                                      =
     ===================================================================
**** 16:43:30 Logging off all sessions
**** 16:43:30 Total processor time used = '0.27 Seconds'
     .        Start : Sat Nov 21 16:43:30 2015
     .        End   : Sat Nov 21 16:43:30 2015
     .        Highest return code encountered = '12'.
**** 16:43:30 FDL4818 FastLoad Terminated
TDExpress14.10.01_Sles10:~ #

 

The script that I used is as follows:

 

.sessions 4;
.logon 127.0.0.1/dbc,dbc;
set record vartext ",";
begin loading retail.employee;

define
Empno (varchar(15))
Empname (varchar(18))
Address (varchar(40))
Phone (varchar(15))
Deptno (varchar(2))
Salary (varchar(12))
YrsExp (varchar(1))
DOB (varchar(10))
Medstat (varchar(1))
Edllev (varchar(1))
Note (varchar(79))

File = Incoming.txt;

Insert into retail.employee
values(
:Empno,
:Empname,
:Address,
:Phone,
:Deptno,
:Salary,
:YrsExp,
:DOB,
:Medstat,
:Edllev,
:Note
);
end loading;
.logoff;

 

 

Can somebody please help me figure out the mistake?
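For what it's worth, two things stand out, with the caveat that this is read from the script rather than tested on this exact release: BEGIN LOADING requires an ERRORFILES clause (FastLoad appends a WITH phrase to the statement it submits, which would explain the complaint between 'employee' and 'WITH'), and the fields in the DEFINE list must be separated by commas. A corrected sketch (the error table names are placeholders):

.sessions 4;
.logon 127.0.0.1/dbc,dbc;
set record vartext ",";
begin loading retail.employee
errorfiles retail.employee_err1, retail.employee_err2;
define
Empno (varchar(15)),
Empname (varchar(18)),
Address (varchar(40)),
Phone (varchar(15)),
Deptno (varchar(2)),
Salary (varchar(12)),
YrsExp (varchar(1)),
DOB (varchar(10)),
Medstat (varchar(1)),
Edllev (varchar(1)),
Note (varchar(79))
file = Incoming.txt;
insert into retail.employee
values (:Empno, :Empname, :Address, :Phone, :Deptno, :Salary, :YrsExp, :DOB, :Medstat, :Edllev, :Note);
end loading;
.logoff;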

 

 


Define schema to load file source with additional derived columns


I am using TPT 14.10.05 on AIX. I want to read a file and add some fields to the row from job variables before loading it into a table.

I am able to do this by creating the table within the TPT script and setting the column defaults from the job variables, but I can't add the derived columns to the producer output.
The TPT manuals say derived fields can be added to the SELECT statement of a file-reader producer. However, I can't seem to define or infer the correct producer schema for reading the file (without the derived field) and then a different consumer schema (with the derived field) for the table.

Below is a summary of the main objects (the site only allows uploads of images):

CREATE MULTISET TABLE DEV4_TEMP.W_TEST_LOADFILE,
 NO FALLBACK , NO BEFORE JOURNAL, NO AFTER JOURNAL, CHECKSUM = DEFAULT, DEFAULT MERGEBLOCKRATIO (
FILE_INSTANCE_ID INTEGER DEFAULT NULL,
RECORD_TYPE VARCHAR(200) CHARACTER SET LATIN NOT CASESPECIFIC DEFAULT NULL)
NO PRIMARY INDEX;

-- File contains one VARCHAR column that matches table definition.
REPLACE VIEW DEV4_TEMP.W_TEST_READFILE AS
 LOCKING ROW FOR ACCESS SELECT
RECORD_TYPE
from DEV4_TEMP.W_TEST_LOADFILE;

-- Want to insert the column from the file and an additional column from a variable.
REPLACE VIEW DEV4_TEMP.W_TEST_INSERT AS
 LOCKING ROW FOR ACCESS SELECT
FILE_INSTANCE_ID,RECORD_TYPE
from DEV4_TEMP.W_TEST_LOADFILE;

STEP LOAD_WT2
(
  DEFINE SCHEMA LOAD_SCHEMA FROM TABLE DELIMITED @LoadView;

  APPLY @WorkTableInsert2 TO OPERATOR ($INSERTER())
 ,APPLY TO OPERATOR ($SCHEMAP ATTR( PrivateLogName = 'imp_SCHM', DumpFieldsWithTrans='Y', AllRecords='Yes', RecordCount=10))
  SELECT @FileInstance as FILE_INSTANCE_ID,RECORD_TYPE FROM OPERATOR($FILE_READER()) WHERE RECORD_TYPE=@RecordType;
);

--VARIABLES FILE
/* GLOBALS */
FileReaderFormat = 'Delimited'
,OpenMode = 'Read'
,FileReaderIndicatorMode='N'
,FileReaderTrimColumns='None'
,FileReaderAcceptExcessColumns='N'
,FileReaderAcceptMissingColumns='N'
,FileReaderTruncateColumnData='N'
,EnableScan='N'
,FileReaderRecordErrorFileName='./BadRecs_'||@FileName
,FileReaderRecordErrorVerbosity='High'
,FileReaderPrivateLogName = 'imp_FR'
,InserterPrivateLogName = 'imp_INS'
,DDLPrivateLogName = 'imp_DDL'
,FileReaderSkipRowsEveryFile='Y'
,FileReaderNullColumns='Y'
,SourceFormat='Delimited'
,RecordErrorFilePrefix='BadRecs_'

/* BATCH SPECIFIC */
,FileInstance='1'
,RecordType='Invoice'
,FileReaderSkipRows=0
,FileReaderTextDelimiter='TAB'
,TargetWorkingDatabase = 'DEV4_TEMP'
,TargetTable = 'W_TEST_LOADFILE'
,WorkTableLoad = @TargetWorkingDatabase||'.'||@TargetTable
,FileView = @TargetWorkingDatabase||'.W_TEST_READFILE'
,LoadView = @TargetWorkingDatabase||'.W_TEST_INSERT'
,WorkTableInsert2='INSERT INTO '||@LoadView||' (FILE_INSTANCE_ID, RECORD_TYPE) VALUES (:FILE_INSTANCE_ID, :RECORD_TYPE);'
,WorkTableCreate='CREATE MULTISET TABLE '||@WorkTableLoad||',
 NO FALLBACK , NO BEFORE JOURNAL, NO AFTER JOURNAL, CHECKSUM = DEFAULT, DEFAULT MERGEBLOCKRATIO (
FILE_INSTANCE_ID INTEGER DEFAULT '||@FileInstance||',
RECORD_TYPE VARCHAR(200) CHARACTER SET LATIN NOT CASESPECIFIC DEFAULT '''||@RecordType||'''
)NO PRIMARY INDEX;'
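For comparison, here is a sketch of the pattern I would expect to work (not verified on 14.10.05; it reuses the objects above): declare the narrower file schema explicitly instead of generating it from the wider insert view, and let the APPLY's SELECT add the derived column so the projection lines up with the consumer's column list:

DEFINE SCHEMA FILE_SCHEMA   /* matches the file only; no derived column */
( RECORD_TYPE VARCHAR(200) );

STEP LOAD_WT2
(
  APPLY @WorkTableInsert2 TO OPERATOR ($INSERTER())
  SELECT @FileInstance AS FILE_INSTANCE_ID, RECORD_TYPE
  FROM OPERATOR ($FILE_READER(FILE_SCHEMA))
  WHERE RECORD_TYPE = @RecordType;
);

As I understand the template operators, passing a schema name in the parentheses makes the producer read with that schema while the SELECT projects the extra column for the consumer; treat this as a starting point rather than confirmed syntax.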
 


Copy data from TD 15.00 to TD 15.10 (TD Express for VMware)


Hi,
I just tried to move data between TD 15.00.00.02 and TD 15.10.00.06 (both TD Express for VMware). I'm using arcmain from TTU 15 (arcmain 15.00.00.01). The archive runs perfectly, but when copying I get: 11/25/2015 22:35:25  *** Failure ARC0010:The DBS Software Version is incompatible.
How can I resolve this? Do I need to use arcmain and the libs from TTU 15.10 for both the archive and the restore/copy, or is there something else?
Also, is TTU 15.10 available for download somewhere? I cannot find it here.
Kind regards,
Toto


BTEQ - Maintain Audit Info


All,

HAPPY THANKSGIVING DAY!

I need help with BTEQ. In our project, the client has asked us to maintain full audit information for every BTEQ load.

We need to maintain the source and target processed-row counts: how many rows were processed, the source row count, and the target row count. These counts should be inserted into an audit table. [Please suggest a query.] I can get the row count using ACTIVITY_COUNT in a procedure, but I don't want to use a procedure.
We also need to maintain the job status: while the BTEQ job is running, a 'Running' status should be inserted into the table, and once it completes or fails, the status should be updated accordingly. [Please suggest a query.]
We also need error handling if the BTEQ job fails. [Please suggest a query.]
We have to write this BTEQ inside a UNIX script.
Please suggest the best approach for such a requirement.

Thanks,
Tushar
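A minimal sketch of one common shell-plus-BTEQ pattern, assuming a simple audit table (auditdb.JOB_AUDIT and all object names here are placeholders): insert a 'Running' row up front, scrape the 'rows added' count from the BTEQ log, and close out the audit row based on the return code:

#!/bin/ksh
bteq <<EOF > load.log 2>&1
.LOGON tdpid/user,password
INSERT INTO auditdb.JOB_AUDIT (job_name, status, start_ts)
VALUES ('my_load', 'Running', CURRENT_TIMESTAMP);
INSERT INTO tgtdb.target_tbl SELECT * FROM stgdb.source_tbl;
.IF ERRORCODE <> 0 THEN .QUIT 12
.LOGOFF
.EXIT 0
EOF
rc=$?

# BTEQ prints e.g. "*** Insert completed. 12345 rows added." after the load.
rows=$(grep 'rows added' load.log | tail -1 | tr -cd '0-9')

status='Completed'
[ $rc -ne 0 ] && status='Failed'

bteq <<EOF >> load.log 2>&1
.LOGON tdpid/user,password
UPDATE auditdb.JOB_AUDIT
SET status = '$status', target_rows = ${rows:-0}, end_ts = CURRENT_TIMESTAMP
WHERE job_name = 'my_load' AND status = 'Running';
.LOGOFF
EOF

The second BTEQ session records the outcome even when the first one aborts, which gives you the Running/Completed/Failed lifecycle without a stored procedure.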


TPT API Error while using with INFA


Hi folks,
Please help me understand the TPT API error below, which occurs when using a TPT Stream connection from Informatica; the same job works fine over an ODBC connection to the same database and table.
Environment details:
Informatica version: 9.5.1
Teradata Database version: 14.00
TTU version: 13.10
Error details:
[ERROR] Type:(Teradata PT API Error), Error: (libstreamop.so) instance(1): INITIATE method failed with status = Fatal Error)
[ERROR] Plug-in failed to Initiate Teradata database connection.

Please find the table definition below for reference:

CREATE MULTISET TABLE XXXX.XXXXXX ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      TRAN_DATE TIMESTAMP(0) NOT NULL,
      TRAN_TIME VARCHAR(6) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      SERN VARCHAR(8) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      DBIT_ACCT_NUMB VARCHAR(12) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      CRDT_ACCT_NUMB VARCHAR(19) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      MONTH_KEY VARCHAR(6) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      PERIOD_ID VARCHAR(8) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      BRCH_CODE VARCHAR(6) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      TELL_ID VARCHAR(8) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      SYST_PDCT_TYPE VARCHAR(2) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      TRAN_TYPE VARCHAR(2) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      CASH_AMT DECIMAL(15,2) NOT NULL,
      TRAN_AMT DECIMAL(15,2) NOT NULL,
      TRAN_INDC VARCHAR(1) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      TRAN_CODE VARCHAR(3) CHARACTER SET LATIN CASESPECIFIC NOT NULL,
      ITEM_COUNT DECIMAL(7,0) NOT NULL,
      CLER_CHEQ_INDC VARCHAR(1) CHARACTER SET LATIN CASESPECIFIC,
      ACTL_CASH_AMT DECIMAL(15,2),
      TRNF_AMT DECIMAL(15,2))
PRIMARY INDEX ( SERN ,DBIT_ACCT_NUMB ,CRDT_ACCT_NUMB )
PARTITION BY RANGE_N(MONTH_KEY (DATE, FORMAT 'yyyymm') BETWEEN DATE '2014-05-01' AND DATE '2015-08-31' EACH INTERVAL '1' MONTH )
UNIQUE INDEX ( TRAN_DATE ,TRAN_TIME ,SERN ,DBIT_ACCT_NUMB ,CRDT_ACCT_NUMB , MONTH_KEY ,PERIOD_ID );

 

Please let me know if any more details are required to analyze the error. Thanks.


Regex function REGEXP_REPLACE not accepting parameter


Hi All,
I want to replace all occurrences of RAM with SHYAM in a column. I used this query:
UPDATE table_name
SET col1 = REGEXP_REPLACE(col1, RAM, SHYAM, 1, 0, 'i');
I am getting error 9134: "Occurrences should be greater than 0". I tried it on the DEV box and it works fine; however, on the PROD box it fails.
Please help.
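As a point of comparison, here is a hedged sketch of the call with the string literals quoted (error 9134 suggests the PROD release rejects 0 as the occurrence argument, so the second form drops the optional arguments; verify the defaults against your release's documentation):

UPDATE table_name
SET col1 = REGEXP_REPLACE(col1, 'RAM', 'SHYAM', 1, 0, 'i');

-- If occurrence = 0 is rejected, try the short form and let the
-- optional position/occurrence/match arguments take their defaults:
UPDATE table_name
SET col1 = REGEXP_REPLACE(col1, 'RAM', 'SHYAM');

Note that the case-insensitive 'i' flag is dropped in the short form, so add it back once you confirm which occurrence values the PROD version accepts.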


TPT05015: Error: Cannot open the file '$SCHEMA_GEN_D_TBL001.txt' due to error 13. Permission denied


Hi.  I am executing `tbuild` with a job script that makes use of templates to delete from a staging table and load data into that table from a delimited text file.  When I execute this command logged in as my user it succeeds.  However, when I execute it logged in as someone else, it fails with:

Teradata Parallel Transporter Version 14.10.00.00
TPT_INFRA: TPT04032: Error: Schema generation failed for table 'MY_STG_SCHEMA.MY_STG_TABLE' in DBS 'foo':
  "GetDelimitedFileSchema" status: 48.

Job script preprocessing failed.
TPT_INFRA: TPT05015: Error: Cannot open the file '$SCHEMA_GEN_D_TBL001.txt' due to error 13.
                     Permission denied
Job terminated with status 12.

 

 
The error message is clear enough:  logged on as the other user, tbuild does not have permission to view a schema file.  But why?  Isn't tbuild creating the file and reading it in the same session?
 
Here is the command I am using:
tbuild -f /home/myusername/Delete_Table_Then_Load_File.txt -u TargetTdpId = 'foo', TargetUserName = 'bar', TargetUserPassword = 'baz', TargetTable = 'MY_STG_SCHEMA.MY_STG_TABLE', FileName = '/home/myusername/my_data_file.dat', DCPFileList = 'N', Format = 'Delimited', LogTable = 'MY_STG_SCHEMA.MY_STG_TABLE_L', DeleterLogTable = 'MY_STG_SCHEMA.MY_STG_TABLE_L', SourceFormat = 'Delimited', DCPSkipRows = 1, DCPSkipRowsEveryFile = 'Y', DCPOpenMode = 'Read', DCPTextDelimiter = '|', DCPQuotedData = 'Optional', DCPOpenQuoteMark = '"', DCPCloseQuoteMark = '"', LoadInstances = 1, DCPInstances = 1 TPT_20151130_153443
 
Here is /home/myusername/Delete_Table_Then_Load_File.txt:

DEFINE JOB Delete_Table_Then_Load_File
DESCRIPTION 'Delete data from a staging table and then load data from a file'
(
  STEP Delete_Table (
    APPLY ('DELETE FROM ' || @TargetTable)
    TO OPERATOR ($DELETER);
  );

  STEP Load_File (
    APPLY $INSERT TO OPERATOR ($LOAD()[@LoadInstances])
    SELECT * FROM OPERATOR ($DATACONNECTOR_PRODUCER()[@DCPInstances]);
  );
);
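One plausible explanation, offered as an assumption to check rather than a confirmed diagnosis: the preprocessing step writes the generated schema file into the directory the job is launched from, so a '$SCHEMA_GEN_D_TBL001.txt' left behind by an earlier run under the first account may be owned by that account and unreadable or unwritable by the second. A quick shell check (note the single quotes, since the file name literally starts with '$'):

# Look for a stale generated-schema file in the launch directory.
ls -l '$SCHEMA_GEN_D_TBL001.txt'
# If it is owned by the other user, remove it (or launch each user's
# jobs from separate working directories) and rerun tbuild.
rm -f '$SCHEMA_GEN_D_TBL001.txt'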


TPT 15.10 templates


Hi, it appears that the template files provided with TPT 15.10 differ from those provided with TPT 15.00; e.g., $FILE_READER used to have variables like @FileReaderFileName, and they are now named like @DCPFileName. I couldn't find a reference to this change in the release notes. Is this just a change to the naming standards, or is there some other reason for it? Should I expect these to change between releases? This would be helpful to know for testing purposes.


Record rejected to error table if date field is enclosed in double quotes


Hi,
 
I am trying to load the two records below using TPT (version 15.10.00.00). The second record loads successfully, but the first record goes to the error table. The same records load fine with TPT version 14.00.00.08.
File:
"1194","N",,"20150422","","20150422"
"1195","N",,"20150423","","20150423"
 
This is a comma-delimited file, and the 4th, 5th, and 6th positions are date fields.
 
I am already using these attributes:
QuotedData = 'Optional'
OpenQuoteMark = '"'
CloseQuoteMark = '"'
NullColumns = 'Y'
 
I suspect this problem may be caused by the version change, since the same file and the same code load the file into the table correctly under TPT 14.00.00.08 but have this issue under TPT 15.10.00.00.
 
Any input on this would be a great help.
 
Thanks,
Muzammil
 


ST_GEOGRAPHY DATA TYPE LOADING POSTGRESQL TIGER DATA INTO TERADATA (FILE BASED)


Here is the goal: migrate the TIGER tables from an existing Postgres DB into a Teradata database. Networking constraints do not allow direct DB-to-DB connectivity, so some sort of file movement is required to support this data migration.
 
1. Some of the TIGER data is stored using simple, common data elements (characters and numbers), so migrating those tables into Teradata is not a problem: the data can be extracted using the PostgreSQL "COPY" command, sent to files, and the files then loaded into tables using MultiLoad or FastLoad.
2. Some of the TIGER data has geospatial data elements. These could also be sent to a file using the "COPY" command; however, I have no idea whether Teradata would be able to read data formatted in that manner.
 
So, how should I extract the data from a Postgres DB (one file per table)? What tool or format is required to ensure the entire table is in a format a Teradata load utility can read and load?
What is the correct corresponding load utility to then load the data into an ST_GEOGRAPHY data type?
 
Any examples would be appreciated.
 
Thanks
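On the extract side, here is a hedged PostGIS sketch (schema, table, and column names are placeholders): render the geometry as well-known text so each table lands in a plain delimited file that a Teradata load utility can carry as a long VARCHAR/CLOB field:

-- Postgres side: one file per table, geometry exported as WKT.
COPY (SELECT gid, name, ST_AsText(geom) AS geom_wkt FROM tiger.roads)
TO '/tmp/roads.csv' WITH (FORMAT csv, DELIMITER '|');

Whether the WKT string can be converted straight into the target geospatial column on insert depends on the Teradata release and type in play; staging it into a VARCHAR/CLOB column first and converting with a follow-up INSERT...SELECT is the conservative route.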


LOADING GEOGRAPHY DATA INTO TERADATA

$
0
0

Has anyone recently successfully loaded geospatial map data into Teradata? I am sourcing shape files (*.shp) from a TIGER database. I have attempted to use Teradata's TDGeoImport utility. However, this tool is far from prime-time ready: after jumping through hoop after hoop, setting this, that, and the other in path statements, etc., I am still no further along than when I started.
So, is there a good, reliable way to load geographic data into Teradata? Possibly a third-party tool? Can you use TPT? If so, do you have to first load the data as a CLOB and convert it?
Kind of at rope's end here. I feel like Teradata dumped some rather questionable capabilities on their database and then decided to ignore the geospatial segment back in 2012.


New to BTEQ and Teradata: Question on Export Report


I am running the following report script:
 
.LOGON xx.x.x.xx/xxxx,xxxx
.SET Titledashes OFF                        ;
.SET Quiet OFF                              ;
.SET Retry OFF                              ;
.SET Width 5000                             ;
.EXPORT REPORT FILE = yy.txt                ;
.SET format OFF                             ;
.SET recordmode OFF                         ;
.SET sidetitles OFF                         ;
SELECT   F3a || F1a || F2a || F4a (TITLE '')
       , TRIM(  d1 ) (TITLE '')
       , TRIM(  d2 ) (TITLE '')
       , TRIM(  d3 ) (TITLE '')
       , TRIM(  d4 ) (TITLE '')
       , d5 (FORMAT 'ZZZZ9.99')(TITLE '')
       , TRIM(  d6 ) (TITLE '')
       , TRIM(  d7 ) (TITLE '')
(d1 through d7 should all be numeric.)

I am trying to understand why TRIM( d1 ) (TITLE '') works but a bare d1 (TITLE '') does not.

Also, the entire formatting clause after d5 does not work.

Finally, do I really have to "roll my own" and manually add \t (tabs) and concatenate the columns to get tab-delimited output?
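On the tab-delimited point, one trick that may save the manual concatenation pain, sketched with the columns above ('09'XC is Teradata's hexadecimal character literal for a tab; the FROM clause is omitted here just as it is in the script above):

SELECT TRIM(d1) || '09'XC ||
       TRIM(d2) || '09'XC ||
       TRIM(d3) (TITLE '')

As for why TRIM(d1) works where a bare d1 does not: TRIM forces an implicit cast of the numeric column to character, whereas a bare numeric column is rendered under its default FORMAT, which REPORT mode pads and aligns differently.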

 


Record Rejected to Error Table if String contains untranslatable character while doing TPT extract/Load in BINARY mode

How to identify erroneous records within error tables


I'm inserting data into a table with an error table (INSERT ... SELECT ... LOGGING ALL ERRORS).
It works fine, although I cannot easily identify the erroneous records. All I have are:
- rowID
- tableID
- fieldID
- errorcode
The natural way would be to identify a record by its rowID, but that is unavailable now.
 
I'd appreciate any advice on how these records may be identified.
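If the error table was created with CREATE ERROR TABLE FOR the target, it should carry the target table's own columns alongside the ETC_ housekeeping columns, so the offending rows can be read back directly (a sketch; the default error table name is ET_ followed by the target name, and target_table is a placeholder):

SELECT ETC_ErrorCode, ETC_FieldId, t.*
FROM   ET_target_table AS t;

Translating ETC_FieldId into a column name may then be possible by matching it against the target table's column metadata in DBC.ColumnsV.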
 
 


Setting TPT Operator from a variable


The 14.10 manual says the TPT operator can be set from a variable, but it does not provide the syntax.
Can anyone provide an example of the syntax? Does this only work for defined operators, or can the templates (e.g., $LOAD, $STREAM, $INSERTER) also be passed in a variable?
This works:
APPLY $INSERT TO OPERATOR ($INSERTER()) SELECT * FROM OPERATOR ($FILE_READER());
but these do not:
(passing Op='$INSERTER')   APPLY $INSERT TO OPERATOR (@Op()) SELECT * FROM OPERATOR ($FILE_READER());
(passing Op='$INSERTER()') APPLY $INSERT TO OPERATOR (@Op)   SELECT * FROM OPERATOR ($FILE_READER());
 


I'm new to BTEQ, trying to execute in batch mode via .bat file


I created two files, test_it.bat and test_it.txt. When I call test_it.bat from a command prompt, the BTEQ script keeps stopping and asking me to enter logon info, even though I have the logon command specified in the test_it.txt file:
test_it.bat
echo off
cd G:\COMMON\dw\Teradata_Output
bteq test_it.txt custlog.log 2>&1
@echo off goto end
:end
@echo exit
test_it.txt
.RUN FILE = G:\COMMON\dw\Teradata_Output\logon.txt
.EXPORT DATA FILE = G:\COMMON\dw\test.csv
.SET SEPARATOR ','
SELECT * FROM HRCP_SEMANTIC_COGNOS_V.BRNCH_LOC_DIM WHERE CO_CD = 'HG';
.LOGOFF
.EXIT
 
Here is the output from the command prompt screen:
G:\COMMON\dw\Teradata_Output>test_it.bat
G:\COMMON\dw\Teradata_Output>echo off
 Teradata BTEQ 15.00.00.00 for WIN32. PID: 7900
 Copyright 1984-2014, Teradata Corporation. ALL RIGHTS RESERVED.
 Enter your logon or BTEQ command:
test_it.txt custlog.log
 *** Warning: You must log on before sending SQL requests.
 Teradata BTEQ 15.00.00.00 for WIN32. Enter your logon or BTEQ command:

.run file = g:\common\dw\teradata_output\logon.txt
.run file = g:\common\dw\teradata_output\logon.txt
 Teradata BTEQ 15.00.00.00 for WIN32. Enter your logon or BTEQ command:
.LOGON GDWP/showcase,
 *** Logon successfully completed.
 *** Teradata Database Release is 14.10.06.01
 *** Teradata Database Version is 14.10.06.01
 *** Transaction Semantics are BTET.
 *** Session Character Set Name is 'ASCII'.
 *** Total elapsed time was 2 seconds.
 BTEQ -- Enter your SQL request or BTEQ command:
 *** Warning: EOF on INPUT stream.
 BTEQ -- Enter your SQL request or BTEQ command:
.logoff
.logoff
 *** You are now logged off from the DBC.
 Teradata BTEQ 15.00.00.00 for WIN32. Enter your logon or BTEQ command:
.exit
.exit
 *** Exiting BTEQ...
 *** RC (return code) = 2
off goto end
exit
G:\COMMON\dw\Teradata_Output>
 
The command screen stops after the "*** Warning: You must log on before sending SQL requests." message and prompts me to enter the logon info. I then manually execute the .RUN command ".run file = g:\common\dw\teradata_output\logon.txt", which is specified in the test_it.txt file.
Basically, I'm trying to find a way to execute test_it.txt in batch so I can schedule the job to run daily.
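For what it's worth, the console output suggests BTEQ never reads test_it.txt at all: the line "bteq test_it.txt custlog.log 2>&1" passes the file names as plain arguments, and BTEQ then waits on the keyboard for commands. BTEQ reads its script from standard input, so the .bat needs I/O redirection. A corrected sketch of test_it.bat:

echo off
cd G:\COMMON\dw\Teradata_Output
REM Feed the script into stdin and capture stdout/stderr in the log.
bteq < test_it.txt > custlog.log 2>&1
@echo exit

With the redirection in place, the .RUN, .EXPORT, and .LOGOFF commands in test_it.txt run unattended, which makes the job schedulable.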
 


BTEQ behaviour for DECIMAL(10,0) and DECIMAL(18,2)


Hi All, I am trying to export data from Teradata to a Unix file and am facing an issue with DECIMAL columns.

The table structure involved is:

CUSTOMER_IDENTIFIER DECIMAL(10,0) NOT NULL
CUSTOMER_BALANCE DECIMAL(18,2) NOT NULL

Values:
CUSTOMER_IDENTIFIER = -229286382
CUSTOMER_BALANCE = -0.08

Issue 1, and the solution I found:
First I was using TRIM(CAST(CUSTOMER_IDENTIFIER AS CHAR(100))); however, I was getting -229286382. as output (a dot (.) at the end).
To eliminate this I modified the expression to TRIM(CAST(CUSTOMER_IDENTIFIER AS FORMAT 'Z(20)9')), which gives 229286382 (no dot) but drops the negative (-) sign. So I modified it again to TRIM(CAST(CUSTOMER_IDENTIFIER AS FORMAT '-(20)9')), which worked perfectly.

Issue 2, no solution yet:
CUSTOMER_BALANCE does not work with any of the above formatting. The values I am getting are:

TRIM(CAST(CUSTOMER_BALANCE AS FORMAT 'Z(20)9')) = 0
TRIM(CAST(CUSTOMER_BALANCE AS CHAR(100))) = -.08  (notice the 0 is missing)
TRIM(CAST(CUSTOMER_BALANCE AS FORMAT '-(20)9')) = 0

Can we have a single format which would handle both issues?
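A hedged sketch of formats that should handle both columns (worth verifying on your release): give the balance a FORMAT with explicit decimal positions and a 9 ahead of the decimal point, which keeps the sign, prints the leading zero, and avoids the trailing dot:

SELECT TRIM(CAST(CUSTOMER_IDENTIFIER AS FORMAT '-(20)9'))    (TITLE ''),
       TRIM(CAST(CUSTOMER_BALANCE    AS FORMAT '-(15)9.99')) (TITLE '')
FROM   my_table;   -- placeholder table name

A truly single format for both columns is awkward, because any decimal-aware format such as '-(15)9.99' would append '.00' to the identifier; per-column formats that match each column's scale are the cleaner fix.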

 
