Channel: Teradata Downloads - Tools

FASTLOAD on TTU15 - Empty file on read open


Hello, I just upgraded to TTU15 on a Windows 64-bit OS and installed all the goodies. Now I'm running a large monthly process with a lot of FASTEXPORTs and FASTLOADs, all run as scripts in a DOS window. FASTEXPORT has no issues. But when I run the FASTLOAD I get "Empty file on read open" and it finishes with zero rows read, even though the file is a 5 GB pipe-delimited data file.

Can someone please share with me a simple way to convert a straightforward FASTLOAD script run from a Windows DOS window into a TPT load that can be run the same basic way, so the new TTU15 can process this data load? I'm hoping little change is needed, ideally just syntax tweaks to the existing scripts. Many thanks in advance. Please let me know what additional detail you may need.
 
-- Begin sample script

-- BTEQ Old School FASTLOAD

SESSIONS 10 ;
ERRLIMIT 50 ;

RUN C:\some_logins_dir\xxxx_LOGON.TXT ;
SHOW VERSIONS ;

.SET RECORD TEXT

DROP TABLE database.target_table ;
DROP TABLE database.target_table_err1 ;
DROP TABLE database.target_table_err2 ;

CREATE SET TABLE database.target_table
 , NO FALLBACK
 , NO BEFORE JOURNAL
 , NO AFTER JOURNAL
 , CHECKSUM = DEFAULT
 , DEFAULT MERGEBLOCKRATIO
       (
column_name_1 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
column_name_2 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
column_name_3 VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC
 )
PRIMARY INDEX ( column_name_1 );

RECORD 1 ;

SET RECORD UNFORMATTED ; 

SET RECORD VARTEXT "|" NOSTOP;
DEFINE
  column_name_1 (VARCHAR(255))
, column_name_2 (VARCHAR(255))
, column_name_3 (VARCHAR(255))
FILE C:\some_dir\filename_here.TXT ;

SHOW ;

BEGIN LOADING database.target_table
ERRORFILES
 database.target_table_err1 ,
 database.target_table_err2 ;

INSERT INTO database.target_table
(
  column_name_1
, column_name_2
, column_name_3
)
VALUES
(
  :column_name_1 ,
  :column_name_2 ,
  :column_name_3
) ;

END LOADING ;
  
.LOGOFF ;

-- end of sample script
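
-- For reference: the least-change way to move a delimited-file FASTLOAD to TPT is usually
-- the TPT Easy Loader, tdload, which generates and runs the TPT job itself, so there is no
-- DEFINE/INSERT boilerplate to rewrite. A minimal sketch reusing the names from the sample
-- script above; the logon values are placeholders, the target table is assumed to already
-- exist and be empty, and ^ is just the DOS-window line continuation:

tdload --SourceFileName C:\some_dir\filename_here.TXT ^
       --SourceTextDelimiter "|" ^
       --TargetTdpId your_tdpid ^
       --TargetUserName your_user ^
       --TargetUserPassword your_password ^
       --TargetTable database.target_table

-- tdload normally loads with the Load operator (the FastLoad protocol) and creates its own
-- error tables, so the BEGIN/END LOADING and ERRORFILES statements have no direct equivalent to port.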
 


How to set '0x0001' as delimiter in tdload command?


I want to use the tdload command to load a flat file whose fields are delimited by 0x0001 into Teradata. You can find my two commands below.

But they both report "length overflow the first row". In my opinion, I am not setting the delimiter correctly.

 

tdload -f filename \
  -u user -p passwd -h server -t tablename \
  --TargetWorkingDatabase dbname
  -d SOH

OR

tdload --SourceFileName filename \
  --SourceTextDelimiter 0x0001 \
  --TargetUserName user \
  --TargetUserPassword passwd \
  --TargetTdpId server \
  --TargetTable tablename \
  --TargetWorkingDatabase dbname

 

In addition, we are on Teradata version 14.
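
A hedged sketch of the usual approach: the plain SourceTextDelimiter option expects the literal delimiter character, which is why SOH and 0x0001 are not interpreted as the control character. The DataConnector operator has a TextDelimiterHEX attribute for non-printable delimiters, and the assumption below is that your TPT release exposes it to tdload as the SourceTextDelimiterHEX job variable:

tdload --SourceFileName filename \
  --SourceTextDelimiterHEX 01 \
  --TargetUserName user \
  --TargetUserPassword passwd \
  --TargetTdpId server \
  --TargetTable tablename \
  --TargetWorkingDatabase dbname

If this tdload release does not accept that variable, the same TextDelimiterHEX = '01' attribute can be set on the DataConnector producer in a hand-written tbuild script.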

 


Transformations in BTEQ vs TPT


Hi All,
Does someone have a chart that describes the pros and cons of performing transformations in BTEQ vs TPT? To be more precise, I am looking for what can and cannot be done with respect to transformations in BTEQ and in TPT.
 
This is kind of urgent, so any help would be greatly appreciated.
 
Thank you.


How to use a control character as a delimiter in FastLoad


Hi All,
I want to know whether I can set an ASCII control character as the delimiter when I use FastLoad to load data.
For example:
set record vartext \001
or set record vartext \007 
...
Please give me a hand.  Thank you.
 


TPT Error Message - Delimiter did not immediately follow close quote mark in row 109785, col 8.


Hi,
I get the above error message for a file. This file has Unix-style line endings (i.e. LF).
However, if I go into the data (using Notepad++) and change to Windows-style line endings (i.e. CRLF), then the file is imported successfully.
While TPT says "1 error rows sent to error file out.err", there are actually two rows in that file, as if TPT hasn't noticed the line feed. However, it had managed to process 109784 rows before this point without issue.
The lines in the error file look properly formatted.

I am using: Teradata Parallel Transporter Version 15.10.00.03 64-Bit.

 

Any ideas?
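
For anyone checking their own job against this: the close-quote message comes from the DataConnector operator's quoted-data parsing, which is governed by attributes along these lines. This is only an illustrative fragment of a producer definition, with placeholder file name, schema name, and values, not the poster's actual settings:

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA INPUT_SCHEMA                          /* placeholder schema name */
ATTRIBUTES
(
    VARCHAR FileName       = 'input.txt',    /* placeholder */
    VARCHAR Format         = 'DELIMITED',
    VARCHAR TextDelimiter  = ',',            /* placeholder */
    VARCHAR QuotedData     = 'Optional',     /* quoted fields allowed but not required */
    VARCHAR OpenQuoteMark  = '"',
    VARCHAR CloseQuoteMark = '"'
);

On Windows the DELIMITED format is generally happiest with the platform record terminator (CRLF), which would be consistent with the LF-only file tripping on a quoted field while the CRLF version loads cleanly.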

 

 

 


BTEQ Problem when reporting column that has a UDF that returns varchar(80)


The following BTEQ script has two versions of the SELECT statement. When I run the first one I get the error "A column or character expression is larger than the max size".
The function CleanDesc returns VARCHAR(80) and is shown below. There is no problem if I return CHAR(80).
.LOGON xx/xx,xx
.SET Titledashes OFF                        ;
.SET Quiet OFF                              ;
.SET Retry OFF                              ;
.SET format OFF                             ;
.SET recordmode OFF                         ;
.SET sidetitles OFF                         ;
.SET SEPARATOR '            '                       ;
.EXPORT REPORT FILE = ProductHierarchy.txt  ;
.SET Width 5000                             ;
 
SELECT    'doggy',           cleandesc('"GEN2,Products"',56)
SELECT    'doggy', cast(cleandesc('"GEN2,Products"',56) as varchar(80))
 
;
 
 
 
.EXPORT RESET
.LOGOFF
.QUIT
 
 
 
 
CREATE FUNCTION CleanDesc ( desc_to_be_cleaned VARCHAR(200), max_length INTEGER )
     RETURNS VARCHAR(80)
     LANGUAGE SQL
     DETERMINISTIC
     CONTAINS SQL
     CALLED ON NULL INPUT
     SQL SECURITY DEFINER
     COLLATION INVOKER
     INLINE TYPE 1
     RETURN SUBSTR(      CASE WHEN desc_to_be_cleaned IS NULL
                              THEN ''
                              WHEN     REGEXP_INSTR( SUBSTR( LTRIM( desc_to_be_cleaned ), 1, 1 ),'[\w/;:>?\[\]]' ) = 0
                              THEN 'X '
                              ELSE ''
                          END
                      || REGEXP_REPLACE(   REGEXP_REPLACE(   TRIM( desc_to_be_cleaned )
                                                           , '["]'
                                                           , ''''''
                                                           , 1
                                                           , 0
                                                           , 'i'
                                                         )
                                         , '[^[\w''()+,-./:;<=>?@/[/]_`| ]'
                                         , 'X'
                                         , 1
                                         , 0
                                         , 'i'
                                       )
                   , 1
                   , CASE WHEN    max_length IS NULL
                               OR max_length < 1
                               OR max_length > 80
                          THEN 80
                          ELSE max_length
                      END
                  )
--
--
--
--     legal at beginning   /;:>?[]
--
--     legal middleORend    '()+,-./:;<=>?@[]_`|
--
--
--     not legal at begining '()+,-.<=@\_|
--
;


Changing network configuration for VMs in VMWare Player


The most convenient and trouble-free network setting for VMware images is NAT. The default setting uses DHCP for the VMs, meaning that when more than one VM gets started (as in UDA, or when playing with Hadoop clusters) the IP addresses get randomized. Creating fixed IP addresses works around this, but means that we may have to change the default VMware Player subnet used for NAT. The script below will do this automatically, changing it to 192.168.100.x:
vnetlib.exe -- stop nat
vnetlib.exe -- stop dhcp
vnetlib.exe -- set vnet vmnet8 mask  255.255.255.0
vnetlib.exe -- set vnet vmnet8 addr  192.168.100.0
vnetlib.exe -- set adapter vmnet8 addr 192.168.100.2
vnetlib.exe -- set nat vmnet8 internalipaddr 192.168.100.254
vnetlib.exe -- update dhcp vmnet8
vnetlib.exe -- update nat vmnet8
vnetlib.exe -- update adapter vmnet8
vnetlib.exe -- start dhcp
vnetlib.exe -- start nat


TPT Load performance


Hello,
What's the typical load performance/throughput one can expect for the following configuration?
 

  • 10 x 100 GB uncompressed files (total size). Please assume these files are on fast storage (2-4 TB/hour I/O throughput) attached to a server with a 16-core Xeon processor and 256 GB RAM. There is 10 Gb connectivity between this server and the Teradata appliance.
  • Teradata 2800 appliance (4 node).
  • Loading using TPT into an empty table with NOPI.

What should the throughput be if you load the same files into an 8-node Teradata 2800 appliance?
 
Thanks in advance for your response.
 
 
 
 
 
 


Teradata Python Module - SP return value truncation with CLOB data type in Python. Please help


Hi,
   I have installed the teradata Python module with Python 3.5.1. I'm calling a Teradata SP that returns a CLOB and some VARCHAR parameters. The SP's IN/OUT parameters are shown below:
TPTCLOB_Generate (
         IN  iProcess_Name        VARCHAR(30)
        ,IN  iDebug_Level         BYTEINT
        ,IN  iProcess_Type        BYTEINT
        ,IN  iProcess_State       BYTEINT
        ,IN  iTD_Server           VARCHAR(30)
        ,IN  iTD_User_Name        VARCHAR(30)
        ,IN  iPassword_String     VARCHAR(30)
        ,OUT oReturn_Code         SMALLINT
        ,OUT oReturn_Message      VARCHAR(255)
        ,OUT oReturn_Script       CLOB(1000000000) -- used for returning the generated TPT script
        ,OUT oReturn_Parameters   VARCHAR(4000)
        ,OUT oReturn_LogonText    VARCHAR(1000)
        )
 
Now the only issue I'm getting is that the oReturn_Script CLOB contents get truncated: only 117 lines of the TPT script are returned in the Python result variable. But when I call the same SP from SQL Assistant I get the complete script (181 lines in total).
 
Below is the Python call to the SP:
results = session.callproc("GDEV1P_FF.GCFR_FF_TPTLoadCLOB_Generate",("LD_668_66_Customer",6,17,0,"localhost","dbc","dbc",teradata.OutParam("oCode"),teradata.OutParam("oMessage"),teradata.OutParam("oScript"),teradata.OutParam("oParams"),teradata.OutParam("oLogon")))
 
When I print(results.oScript) or write it to a file, the oScript contents are truncated. The Teradata Python Module page says that VARCHAR, CLOB, UDT, ARRAY, etc. data types are returned as Python unicode strings.
 
How can I get the complete contents? Please help!
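
Not a confirmed fix, but one thing worth trying is sketched below: declare the CLOB output parameter explicitly rather than letting the module infer it. The dataType argument to OutParam follows the PyTd README examples; whether it prevents the truncation here is an assumption, and the connection values are the same placeholders as in the original call.

import teradata

udaExec = teradata.UdaExec(appName="TPTCLOBTest", version="1.0", logConsole=False)

with udaExec.connect(method="odbc", system="localhost",
                     username="dbc", password="dbc") as session:
    results = session.callproc(
        "GDEV1P_FF.GCFR_FF_TPTLoadCLOB_Generate",
        ("LD_668_66_Customer", 6, 17, 0, "localhost", "dbc", "dbc",
         teradata.OutParam("oCode"),
         teradata.OutParam("oMessage"),
         # declare the CLOB explicitly instead of letting it default
         teradata.OutParam("oScript", dataType="CLOB"),
         teradata.OutParam("oParams"),
         teradata.OutParam("oLogon")))
    # write the full script out to compare line counts with the SQL Assistant result
    with open("generated_tpt_script.txt", "w") as f:
        f.write(results.oScript)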


Problem with Fast Load script


Hello,

 

I am trying to create a new table in my Teradata database and then insert data into it coming from another table. The script I use is the following:

 

ERRLIMIT 25;

DATABASE XXXXXX;

SET RECORD VARTEXT ";";

DROP TABLE XXXXXX.YYYYYY;

CREATE MULTISET TABLE XXXXXX.YYYYYY,
     NO FALLBACK,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      date_utc VARCHAR(20),
      heure_utc VARCHAR(8),
      last_technology VARCHAR(15) CHARACTER SET LATIN NOT CASESPECIFIC COMPRESS ('2G','3G','4G','4G+','WIFI','UNKNOWN'),
      IMEI VARCHAR(100)
     )
     PRIMARY INDEX (IMEI);

DROP TABLE XXXXXX.YYYYYY_FASTERR1;
DROP TABLE XXXXXX.YYYYYY_FASTERR2;

BEGIN LOADING XXXXXX.YYYYYY ERRORFILES XXXXXX.YYYYYY_FASTERR1, XXXXXX.YYYYYY_FASTERR2;

INSERT INTO XXXXXX.YYYYYY
  SELECT SUBSTR ( DATE_UTC, 1,10) AS DATE_UTC,
  SUBSTR ( DATE_UTC, 11,8) AS HEURE_UTC,
  LAST_TECHNOLOGY,
  IMEI
  FROM XXXXXX.ZZZZZZ;

END LOADING;

.logoff;

The problem is that I get the following error message:

**** 15:42:19 FDL4814 Failure due to no input file specified
**** 15:42:19 DEFINE statement must be issued before INSERT

and I don't understand it, as I am not inserting data from a CSV file but from another table.

Is there a solution to correct this error?

Thank you for any help.
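
For what it's worth, the error is expected: FastLoad only loads data from a client-side input file (which is what the DEFINE statement describes), so it cannot populate a table with an INSERT ... SELECT from another table. A table-to-table copy like this is ordinary SQL and can be run through BTEQ instead. A minimal sketch of the same steps, with the .LOGON line as a placeholder:

.LOGON xx/xx,xx

DATABASE XXXXXX;

DROP TABLE XXXXXX.YYYYYY;

CREATE MULTISET TABLE XXXXXX.YYYYYY,
     NO FALLBACK,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      date_utc        VARCHAR(20),
      heure_utc       VARCHAR(8),
      last_technology VARCHAR(15) CHARACTER SET LATIN NOT CASESPECIFIC
                      COMPRESS ('2G','3G','4G','4G+','WIFI','UNKNOWN'),
      IMEI            VARCHAR(100)
     )
     PRIMARY INDEX (IMEI);

INSERT INTO XXXXXX.YYYYYY
SELECT SUBSTR(DATE_UTC, 1, 10) AS DATE_UTC,
       SUBSTR(DATE_UTC, 11, 8) AS HEURE_UTC,
       LAST_TECHNOLOGY,
       IMEI
FROM   XXXXXX.ZZZZZZ;

.LOGOFF;
.QUIT;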

 


TPT via .NET (C#)


Hi, I'm wondering about the best way to trigger a FastLoad from a Windows .NET platform... It looks like the TPT API might be the way to go - has anyone ever written a C# wrapper for this? Or are there any other approaches anyone would recommend?


Record Comparison between Oracle and Teradata


My source data is in Oracle and my target data is in Teradata. Can you please suggest an easy and quick way to compare data between the two different databases?
There are 500 tables that need to be compared.
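
One common pattern, sketched below with hypothetical ORA_STG and TGT names: stage each Oracle table into Teradata (for example with the TPT ODBC operator, or an export/load), then let the database do the comparison with MINUS in both directions. A per-table row-count check first narrows down which of the 500 tables need the full compare.

-- ORA_STG.MY_TABLE is assumed to be a staging copy of the Oracle table loaded into Teradata,
-- TGT.MY_TABLE the Teradata target; both names are hypothetical.

-- Quick check: do the row counts even match?
SELECT COUNT(*) FROM ORA_STG.MY_TABLE;
SELECT COUNT(*) FROM TGT.MY_TABLE;

-- Rows in the staged Oracle data that are missing or different in Teradata:
SELECT * FROM ORA_STG.MY_TABLE
MINUS
SELECT * FROM TGT.MY_TABLE;

-- Rows in Teradata that are missing or different in the staged Oracle data:
SELECT * FROM TGT.MY_TABLE
MINUS
SELECT * FROM ORA_STG.MY_TABLE;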


TPT Number of instances vs load tasks


Hi - if a TPT job uses 2 LOAD operator instances, will it use 2 load tasks on the Teradata side? I am trying to understand the relationship between instances and load task slots.


Issue with importing CLOB data using BTEQ


Hi, I have a simple BTEQ script (given below) to load CLOB data into an empty table.

.LOGON ;

DELETE FROM TTE ALL;

.IMPORT INDICDATA FILE = C:\TEST\Data_Clob;
USING (LOBTYPE CLOB(1000000000))
INSERT INTO TTE ('1', :LOBTYPE);

.LOGOFF;

When I execute the script, I get the below error:
 *** Error: The following occurred during an Access Module read:
 Unexpected data format.
 *** Warning: Out of data.
 *** Warning: EOF on INPUT stream.


SQL Assistant 15.00 Cursor lagging problems


Hello, 
I just updated to the newest SQL Assistant 15.00 and am experiencing lagging problems: the cursor constantly switches back and forth between "text" and "select" mode (the icon) in the Query pane (where you type the SQL). Two of my co-workers are having the same issue. I was wondering if anyone else has had this problem and found a solution? I did not run into this in 14.00. I also tried re-installing 15.00, but that did not solve it.
 
Thanks,
Stanley 


TDLOAD Utility - Input DATE and TIMESTAMP format issue

$
0
0

Hi,
We are using the TDLOAD mechanism to load a staging layer from a file. I believe only the default DATE and TIMESTAMP formats configured in the DBS Control utility (e.g. YYYY-MM-DD) are accepted in the input file.
Is there any functionality/option in the TDLOAD mechanism to accept a custom DATE and TIMESTAMP format specified at run time?
Regards....Sanjoy


Can't find where to download TTU 14.1 , is it possible to use TTU 15.0 on a 14.1 installation?


Is it possible to use TTU 15.0 on a 14.1 installation?
Are there any compatibility issues?
Thanks


Mac OS X: Where is the download for TTU 15.10


I have scoured the downloads section and am unable to find the installer for Teradata Tools and Utilities 15.10 for OS X. I know it existed at some point, as I have BTEQ working on a MacBook Pro, and I'm trying to migrate a BTEQ process to an iMac. If I check the ListProducts app on the MacBook it shows 12 packages, all 15.10.00.XX, including BTEQ 15.10.00.02 (screenshot attached). When I install TTU 14.10 I see BTEQ as an option to install. When I install the 15.10 driver, I only have the option to install the driver and nothing else, and it removes BTEQ 14.
Help!


Teradata Statistics Wizard


I installed Teradata Tools and Utilities 15.00.00.00 but I do not find the Teradata Statistics Wizard installed. Can someone please help me with this?
Thanks
