Channel: Teradata Downloads - Tools
Viewing all 870 articles

IMPORT issue with Teradata SQL Assistant.


Hello,

 

Some Background:

I am trying to IMPORT a .csv file using SQL Assistant. A few columns don't have values, so ideally it should load NULLs (?) into the table, BUT it is loading ZEROs for numeric fields instead of NULLs (and spaces/blanks for char fields).

 

I have tried loading the same table with the same file on a different machine, and there it gets loaded with NULL values as expected!

 

Questions:

 

1. Why does SQL Assistant (same version) behave differently on different machines?

2. Is there any configuration/setting needed to fix this issue?

 

Please advise.

 

 

Thank you all.

 


BTEQ - Parametrize output File


Hi all,
 
I need to export data from Teradata to text files (Windows).
The first file is a .BAT that only calls the BTEQ application:
1 - bteq < file.sql
 
The second one exports the data:
2 -
.logon myhost/myuser,myuser

 

.EXPORT FILE=out.txt

select col1, col2

from table1;

 

.EXPORT RESET

 

.QUIT

 

Now I need to pass a variable from the .BAT file to the .sql file giving the file name where I want to export the data. For example, if the information is related to countries, I want to generate one file per country:

 

1 - 

bteq < file.sql  Canada

bteq < file.sql  USA

bteq < file.sql  Colombia

 

2 - 

 

.logon myhost/myuser,myuser

 

.EXPORT FILE=%Country%.txt

select col1, col2

from table1

where country='%1%';

 

.EXPORT RESET

 

.QUIT

How can I do this?
 
Thanks in advance.
 
Regards.
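BTEQ itself has no parameter substitution, but the .BAT file can generate the .sql script on the fly with the country substituted before calling BTEQ. A sketch under that approach (the file and script names are illustrative, not a Teradata feature):

```bat
@echo off
rem export_country.bat -- call as:  export_country.bat Canada
rem Write a temporary BTEQ script with the first argument (%1) substituted,
rem then run it and clean up.
(
echo .logon myhost/myuser,myuser
echo .EXPORT FILE=%1.txt
echo select col1, col2
echo from table1
echo where country='%1';
echo .EXPORT RESET
echo .QUIT
) > temp.sql
bteq < temp.sql
del temp.sql
```

Called three times (Canada, USA, Colombia), this produces Canada.txt, USA.txt, and Colombia.txt.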


TPT10508: RDBMS error 2583 in Teradata TPT


Hi,
I am trying to insert data from prod to dev using the TPT Update operator. By mistake I deleted all the error tables, including the work table and the log table.
When I submit the script again, I get a TPT10508: RDBMS error 2583. Can you please suggest how to resolve this issue?
Error :
UPDATE_OPERATOR: preparing target table(s)
UPDATE_OPERATOR: TPT10508: RDBMS error 2583: Worktable is missing for ISH_FEDXFGT_L1F5_DB.fxf_ship_rev_credit_comp during MLoad restart.
UPDATE_OPERATOR: disconnecting sessions
 
Thanks
Anil
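For what it's worth, one common recovery when the work/log tables are gone is to release the MultiLoad lock on the target and rerun the job from scratch rather than restart it. A sketch (verify the table's state first; this assumes the job had not yet applied data you need to keep):

```sql
-- Release the MLoad lock left on the target so a fresh job can run:
RELEASE MLOAD ISH_FEDXFGT_L1F5_DB.fxf_ship_rev_credit_comp;

-- If the job had already entered the apply phase, the IN APPLY form is
-- required instead, and the target may then hold partial data:
RELEASE MLOAD ISH_FEDXFGT_L1F5_DB.fxf_ship_rev_credit_comp IN APPLY;
```

The TPT job's checkpoint files may also need removing (for example with the twbrmcp utility) so TPT starts a new job instead of attempting another restart.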
 


TTU Installation on Linux RedHat


Can anyone tell me how I can get hold of the TTU download for Linux Red Hat? I've seen a few posts requesting it, but the only answers I've seen point to documentation about the installation. The only available download is for Windows. I wanted to set up BTEQ, but without the TTU download I can't do it. Thanks.


where to download the installation for Teradata Schema Workbench


Do you know where to download the installation for Teradata Schema Workbench?
Thank you!
 


Question about Teradata Hadoop Connector


Hi,
after using the Teradata Hadoop Connector to load some data into a table in Teradata, I noticed the parameter "sourcepaths" assumes the folder is located on the Hadoop machine assigned to the machine where I'm running the command.
Do you know if it's possible to use a full path (including the server and port) to specify the location of the source data to be imported? I tried setting the full URI to the file (something like hdfs://server:port/path/to/file), but the tool ignores the scheme and host and still assumes path/to/file is on the server assigned to the machine where I'm running the command.
Thanks


MS Connector Attunity Teradata source export Unicode data error


Hi,
I am currently working on a project that uses SSIS to load data from Teradata to SQL Server. I found the SSIS component MS Connector Attunity, which can be used as a Teradata source and destination with great data-loading performance, so I tried it in my SSIS package. The performance is better than ODBC. But after some testing with our DW data, there is a Unicode data issue I still can't find a solution for. Can anyone help with this?
The detail of the issue:
1. Our Teradata (14.10) DW uses UTF16 for the Unicode character set.
2. I installed the MS Connector Attunity for Teradata, TPT, and the Teradata ODBC driver. The SSIS package works well if the Teradata source does not have a Unicode column, but if one column uses a Unicode character set, the data flow in the SSIS package fails. The error message is:

[Teradata Source [31]] Error: TPT Export error encountered during Initiate phase. TPTAPI_INFRA: API306: Error: Conflicting data length for column(1) - Categ_GRP. Source column's data length is (16) Target column's data length is (24).

 

And this error is from the Teradata source side.

 

I researched a lot, and it is mainly because in my SSIS package the driver session is UTF8 while the Teradata source is UTF16: Teradata reserves 2 bytes per character for UTF16 but 3 bytes per character for UTF8, so when the Teradata source component gets the metadata from the source server, the declared lengths conflict.

 

Has anyone else had this issue before? Thanks a lot for your help.
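One hedged workaround for this kind of length conflict is to size the column explicitly in the source query so the session-charset metadata matches what SSIS expects. The CHAR(8) below is only an illustration read off the 16-vs-24-byte ratio in the error message, and casting to LATIN is safe only if the column's data really is representable in LATIN:

```sql
-- 16 bytes at 2 bytes/char (UTF16) suggests CHAR(8); under a UTF8 session
-- Teradata reserves 3 bytes/char, hence the conflicting length of 24.
SELECT CAST(Categ_GRP AS CHAR(8) CHARACTER SET LATIN) AS Categ_GRP
FROM source_table;   -- source_table is a placeholder
```

The alternative is to make the session character sets agree end to end (e.g. run the extracting session as UTF16), which keeps the Unicode data intact.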

 


SQL Assistant Tree Collapse/Expand Icon?


Does anyone know how to change the database tree collapse/expand icon in SQL Assistant?
I've noticed in screenshots that the default is the plus/minus, but mine shows a triangle. I've changed to every SQL Assistant theme and it doesn't change. I'm running SQLA 14.10 on Windows 7.
I've been going through the settings xml but not having much luck so far.
Thanks :-)


TPT/Easy Loader delimiter

$
0
0

Do TPT and Easy Loader (13.10.00.02, moving to 14.10 later this year) support the use of ASCII control characters as the delimiter? If so, how would x01 (Ctrl-A) be represented in the TPT script/tdload command? I tried the below with no luck:

tdload -h td -u loaduser -p xxxxxx -f file1 --TargetTable mytable -d '\001'
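Later TPT releases accept the delimiter as a hexadecimal string through the DataConnector operator attribute TextDelimiterHEX, which avoids shell-escaping a literal Ctrl-A; whether your 13.10.00.02 build has it should be checked against that release's DataConnector documentation. A sketch of the operator attributes:

```
VARCHAR Format = 'Delimited',
/* x'01' (Ctrl-A) as the field delimiter, given as hex */
VARCHAR TextDelimiterHEX = '01'
```

For tdload, the equivalent would be a SourceTextDelimiterHEX entry in the job variables file, if your version supports it.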

 


TPT - BUFFERMAX error


Hi all, I'm pretty new to using TPT and am working with Version 14.00.00.07.  In the last week or so I have been finding that almost all of my jobs involving a simple load from one csv into one table are failing.
The error I get is: TPT19003 BUFFERMAXSIZE: 64260.  However I find once this error occurs the TPT job will then only attempt to process the file that caused this error in the first place, even when the job variables file and all possible places where file name is given are changed.
I see this as somewhat confusing and can only conclude that the log is somehow erroneous or there is some code that I don't have access to which contains the old file name still.
What especially perplexes me is that the files I pass to these TPT jobs all load with no issues through another TPT job when I load them as a list of files in parallel rather than as one file. Due to the sensitive nature of the data I'm handling I can't post verbatim code here, but here is some of the log with specifics censored.
TPT Load failed, running again to clear any locks
Teradata Parallel Transporter Version 14.00.00.07
Job log: <XXXXXXX>
Job id is <XXXXXXX>, running on blx20be01
Found CheckPoint file: <XXXXXXXX>
This is a restart job; it restarts at step Load_data.
Teradata Parallel Transporter DataConnector_data: TPT19006 Version 14.00.00.07
DataConnector_data Instance 1 restarting.
DataConnector_data: TPT19008 DataConnector Producer operator Instances: 1
DataConnector_data: TPT19003 ECI operator ID: DataConnector_data-1555
Teradata Parallel Transporter Load Operator Version 14.00.00.07
Insert_data: private log specified: <XXXXXXX>
Insert_data: connecting sessions
Insert_data: preparing target table
Insert_data: entering Acquisition Phase
DataConnector_data: TPT19222 Operator instance 1 processing file <XXXXXXXX>.      <-- This filename is unrelated to the job and is not mentioned in any script or variable file; why is it appearing here?
DataConnector_data: TPT19003 BUFFERMAXSIZE: 64260
DataConnector_data: TPT19221 Total files processed: 0.
Insert_data: disconnecting sessions
Insert_data: Total processor time used = '1.33 Second(s)'
Insert_data: Start : Thu Apr 24 11:44:13 2014
Insert_data: End   : Thu Apr 24 11:44:19 2014
Job step Load_data terminated (status 12)
Job <XXXXXX> terminated (status 12)
 
Thanks in advance
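The "Found CheckPoint file" / "This is a restart job" lines suggest a stale checkpoint is forcing TPT to resume the old job, old file name and all, regardless of the new variables file. A sketch of clearing it (the job name and install path are placeholders):

```
# Remove the job's checkpoint files so the next run starts as a new job:
twbrmcp <job_name>
# or delete them by hand from the TPT checkpoint directory, e.g.:
#   rm <tpt_install_dir>/checkpoint/<job_name>*
```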
 


Sql assistant complaining about dot net framework


Hello,
 
I am trying to install the Teradata utilities package, but SQL Assistant won't install. It keeps complaining that the .NET Framework isn't present, even though I have installed it. Please help!
 
Regards
Pramod


requesting help with implementing query band on ODBC for Crystal Reports


requesting help with implementing query band on ODBC for Crystal Reports.
We've done this with Business Objects  Universe data sources and now would like to implement query band on  ODBC  data sources for Crystal reports to Teradata using the Teradata ODBC driver.
I'm sure it is documented somewhere, I just haven't found it yet.  If someone could point me in the right direction I'd appreciate it.
Thank you.
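For reference, the query band itself is set with plain SQL once the ODBC session is open; the open question for Crystal Reports is where to configure such a pre-report command. The name/value pairs below are illustrative:

```sql
SET QUERY_BAND = 'ApplicationName=CrystalReports;ReportName=SalesSummary;' FOR SESSION;

-- confirm it took effect:
SELECT GetQueryBand();
```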
 
 


BTEQ Batch Mode Logon


Hello All,
 
I tried logging on to BTEQ in batch mode.
Below is how I tried logging on:
.LOGON 127.0.0.1/jugal,jugal <ENTER>
 
I am getting an "invalid logon" error.
Could you please tell me where I am going wrong?


Unable to load TD driver into ODBC


I tried to upgrade the TTU on my laptop to 14.10. Now I cannot connect to any database via ODBC. The error states that the "setup routines for the Teradata driver could not be found". I've uninstalled and reinstalled several times; still the same message.


TPT Data load from csv file


Hi,
I am trying to load data from a CSV file into a TD table using TPT, version 14.10. The TPT delimiter is ',' and the CSV has data in the format below. How can I load this data?
SEQ_ID ,Prod_TYPE ,ACTIVE_ID ,Region_MAX_LAT ,Region_MAX_LONG ,AR_MIN_LAT ,AR_MIN_LONG ,AR_NUMERIC_ID ,CD_ACQUISITION_ID ,CD_SYSTEM_ID ,EFf_DATE ,EXP_DATE ,LOCAL_SYSTEM_ID ,PPDM_GUID ,PRED_NAME ,REMARK ,SOURCE_TD ,SOURCE_DOCUMENT ,ROW_CHANGED_BY ,ROW_CHANGED_DATE ,ROW_CREATED_BY ,ROW_CREATED_DATE ,ROW_QUALITY
Msrd,ABC_DEF,Y,,,,,4,,,,,,,'BKC Company, Ltd.',,EMW,,,,ETLR,,AWD
I tried enclosing the fields that contain commas in double quotes, but it gives the error below. Please guide me on this. I am using the Update operator because I need a FORMAT expression for a date column, so I can't use FastLoad (i.e., the Load operator in TPT). There is no EOF character; in Notepad the cursor moves to the next line after this row.
The error occurs because the data contains commas (and dots) while my CSV delimiter is also a comma:
UPDATE_OPERATOR: entering DML Phase
UPDATE_OPERATOR: entering Acquisition Phase
FILE_READER: TPT19350 I/O error on file '/data/scripts/AR.csv'.
FILE_READER: TPT19003 Delimited Data Parsing error: Too many columns in row 1
FILE_READER: TPT19221 Total files processed: 0.
UPDATE_OPERATOR: disconnecting sessions
 
 
Kindly advise how I can handle such data.
Thanks in advance; awaiting your response.
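TPT's DataConnector (file reader) operator can treat quoted fields as single values, which is the usual way to handle embedded delimiters such as 'BKC Company, Ltd.'. A sketch of the producer attributes, assuming TPT 14.10 (verify the attribute names against your release's DataConnector documentation):

```
VARCHAR Format = 'Delimited',
VARCHAR TextDelimiter = ',',
/* treat quoted fields as one value even if they contain the delimiter */
VARCHAR QuotedData = 'Optional',
VARCHAR OpenQuoteMark = '"',
VARCHAR CloseQuoteMark = '"'
```

Since the sample row quotes with single quotes rather than double quotes, set OpenQuoteMark/CloseQuoteMark to the single-quote character instead if that is how the file is actually produced.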


BTEQ Batch mode, commands / SQL run from shared server.


Windows 7: BTEQ 13 and 14
Calling a batch file from a shortcut on the end user's machine; the target is \\server\folder\file.bat, with "start in" set to %temp% to prevent CMD prompt errors.
bteq < \\server\folder\commands.txt
commands.txt contains
login info
.EXPORT REPORT FILE=\\server\folder\output.csv
.RUN file \\server\folder\sql.txt
 
Problem: when the shortcut is run, it outputs "bteq 0<\\server\folder\commands.txt" and breaks (note the zero before the input redirect).
In a command prompt, running "bteq < \\server\folder\commands.txt" directly works.
When everything is local — the shortcut pointing to a local batch file with local commands, login, and SQL files — it runs.
 
Thoughts/suggestions?
 
Thank you.
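One thing worth trying: cmd.exe refuses to use a UNC path as a working directory, which is a common source of odd redirection behavior in shortcuts. pushd maps the share to a temporary drive letter first; a sketch using the paths from the post:

```bat
@echo off
rem Map \\server\folder to a temporary drive letter, run BTEQ, then unmap.
pushd \\server\folder
bteq < commands.txt
popd
```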
 


BTEQ Import REPORT Mode. Growing to buffer


Hello All,
When I try to import fixed-length data, the data grows the buffer to 65473 and gets imported into the table incorrectly. The problem is in the USING part, but I am not able to figure out how to overcome it.
Experts, please help me with this issue.
Import Script:
.LOGON 127.0.0.1/jugal,jugal
.IMPORT REPORT FILE=/root/jugal/samples12
.QUIET ON
.REPEAT *
USING
EmpId (INTEGER),
EmpName (CHAR(5)),
DeptId (INTEGER)
INS INTO jugal.NewTable1 values (:EmpId,:Empname,:DeptId);
.LOGOFF
 
samples12.txt
10 Jugal 100
20 Jugal 200
30 Anil 300
 
Target table:
CREATE TABLE jugal.NewTable1
(EmpId INT,
EmpName CHAR(10),
DeptId VARCHAR(10)
);
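In .IMPORT REPORT mode BTEQ reads the file as plain text, so every field in the USING clause must be declared CHAR with the exact column widths; declaring INTEGER makes BTEQ expect 4-byte binary values, which is why the buffer runs away. A corrected sketch for the sample rows, assuming the layout "10 Jugal 100" with single-space separators (fill1/fill2 are hypothetical filler fields for the spaces; Teradata casts the CHAR values to the table's column types on insert):

```
.LOGON 127.0.0.1/jugal,jugal
.IMPORT REPORT FILE=/root/jugal/samples12
.QUIET ON
.REPEAT *
USING
EmpId   (CHAR(2)),
fill1   (CHAR(1)),
EmpName (CHAR(5)),
fill2   (CHAR(1)),
DeptId  (CHAR(3))
INS INTO jugal.NewTable1 VALUES (:EmpId, :EmpName, :DeptId);
.LOGOFF
```

Adjust the CHAR widths to the file's real fixed positions — the third sample row ("30 Anil 300") is only fixed-length if the name field is space-padded to the same width as "Jugal".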



Teradata v 14.10 and TTU v 13.10


Does Teradata Version 14.10 support TTU version 13.10?


BTEQ DATA FILE


Hello All,
 
I am new to Teradata. When I ran the BTEQ script to export the data, it got exported. But when I tried to import the same data into the table, the output was different. Please help me find where I am going wrong.
Tables
CREATE SET TABLE anil.SourceT1
     (
      StudId INTEGER,
      StudName CHAR(8) CHARACTER SET LATIN NOT CASESPECIFIC,
      CourseId INTEGER)
PRIMARY INDEX ( StudId );
 
CREATE SET TABLE anil.TargetT1
 (
StudId INTEGER,
StudName CHAR(8) CHARACTER SET LATIN NOT CASESPECIFIC,
CourseId INTEGER)
PRIMARY INDEX ( StudId );
 
INS INTO anil.SourceT1  values (1,'Anilx',200);
 
BTEQ Scripts 
.logon 127.0.0.1/Anil143,341
.SET WIDTH 500;
.SET TITLEDASHES OFF;
.export data file=output1.txt
sel * from anil.SourceT1;
.export reset
.logoff
 
 --------------------------------------
.logon 127.0.0.1/Anil143,341
.import indicdata file=/root/anil/output1.txt
.quiet on
.repeat*
using
      in_StudId (INTEGER),
      in_StudName (CHAR(9)),
      in_CourseId (INTEGER)
 
insert into anil.TargetT1 values(:in_StudId,:in_StudName,:in_CourseId);
.logoff
------------------------------------------------------------
 
sel * from anil.TargetT1;
output
StudId               StudName  CourseId
1,241,513,984    nilx           25,632
 
Thanks in Advance.
Anil 
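Two mismatches explain the garbage: the export writes DATA format (no indicator bytes) while the import reads INDICDATA (which consumes leading indicator bytes), and the USING clause declares CHAR(9) against a CHAR(8) column — both shift every following byte of the record. A corrected sketch keeping the formats and lengths matched:

```
.logon 127.0.0.1/Anil143,341
.export indicdata file=output1.txt
sel * from anil.SourceT1;
.export reset
.logoff

.logon 127.0.0.1/Anil143,341
.import indicdata file=/root/anil/output1.txt
.quiet on
.repeat *
using
      in_StudId (INTEGER),
      in_StudName (CHAR(8)),
      in_CourseId (INTEGER)
insert into anil.TargetT1 values (:in_StudId, :in_StudName, :in_CourseId);
.logoff
```

Alternatively keep .export data and use .import data — the key is that the export format, the import format, and the USING lengths all agree.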


