Channel: Teradata Downloads - Tools
Viewing all 870 articles

TPT Wizard driver error


When I try to use the TPT Wizard to connect to SQL Server using an ODBC DSN, I get: "There was an error initializing the driver for usage."

My configuration is the following:

Teradata RDBMS version 14.10.03.06
64-bit Windows Server
TTU patched to 14.10.15.00
Microsoft ODBC Driver 11 for SQL Server

Should I be able to get this to work, or is my only option to install the latest TTU 15, which bundles the DataDirect drivers?

Forums: 

Issue with my first Teradata PT


Hi everybody,
I used the sample scripts that come with the Teradata client for my first script.
What I've managed to do is create an empty table using these two scripts:
1. The job variables file (jobvars1.txt):
 TargetTdpId          = 'XXX'
,TargetUserName       = 'YYYY'
,TargetUserPassword   = 'ZZZZ'
,DDLPrivateLogName    = 'ddlprivate.log'
,LoadPrivateLogName   = 'loadprivate.log'
,TargetErrorList      = ['3807']
,TargetTable          = 'EMP_TABLE'
,LogTable             = 'EMP_TABLE_LOG'
,ErrorTable1          = 'EMP_TABLE_E1'
,ErrorTable2          = 'EMP_TABLE_E2'
,SourceFileName       = 'flatfile1.dat'
,SourceFormat         = 'delimited'
,SourceTextDelimiter  = '|'
,OpenMode             = 'read'
 
2. The setup job script:
DEFINE JOB qsetup1
(
APPLY
('DROP TABLE EMP_TABLE;')
,('CREATE TABLE EMP_TABLE(EMP_ID VARCHAR(10), EMP_NAME VARCHAR(10));')
TO OPERATOR ($DDL);
);
When I check in Teradata using basic SQL (select * from EMP_TABLE;), I see no records, which is expected.
But when I execute the third script, which is as follows:
DEFINE JOB qstart1
(
  APPLY $INSERT TO OPERATOR($LOAD)
  SELECT * FROM OPERATOR($FILE_READER);
);
and run the command "tbuild -f qstart1.txt -v jobvars1.txt -j qstart1", I get the following output:
$ tbuild -f qstart1.txt -v jobvars1.txt -j qstart1
Teradata Parallel Transporter Version 14.10.00.04
Job log: /opt/teradata/client/14.10/tbuild/logs/qstart1-27.out
Job id is qstart1-27, running on rgtdvdwhfr
Found CheckPoint file: /opt/teradata/client/14.10/tbuild/checkpoint/qstart1LVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter Load Operator Version 14.10.00.04
$LOAD: private log specified: loadprivate.log
Teradata Parallel Transporter $FILE_READER: TPT19006 Version 14.10.00.04
$FILE_READER Instance 1 directing private log report to 'dtacop-achouaf-22151170-1'.
$LOAD: connecting sessions
$FILE_READER: TPT19003 NotifyMethod: 'None (default)'
$FILE_READER: TPT19008 DataConnector Producer operator Instances: 1
$FILE_READER: TPT19003 ECI operator ID: '$FILE_READER-22151170'
$FILE_READER: TPT19222 Operator instance 1 processing file 'flatfile1.dat'.
$LOAD: preparing target table
$LOAD: entering Acquisition Phase
$LOAD: entering Application Phase
$LOAD: Statistics for Target Table:  'EMP_TABLE'
$LOAD: Total Rows Sent To RDBMS:      10
$LOAD: Total Rows Applied:            10
$LOAD: Total Rows in Error Table 1:   0
$LOAD: Total Rows in Error Table 2:   0
$LOAD: Total Duplicate Rows:          0
$FILE_READER: TPT19221 Total files processed: 1.
$LOAD: disconnecting sessions
$LOAD: Total processor time used = '0.403198 Second(s)'
$LOAD: Start : Thu Jan  8 10:37:13 2015
$LOAD: End   : Thu Jan  8 10:37:21 2015
Job step MAIN_STEP completed successfully
Job qstart1 completed successfully
Job start: Thu Jan  8 10:37:12 2015
Job end:   Thu Jan  8 10:37:21 2015
which suggests that everything went correctly ($LOAD reports 10 rows applied), yet when I check my table I don't find any records in it.
Could anyone help me with this, please?
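One thing the log shows is a leftover checkpoint ("Found CheckPoint file ... This is a restart job; it restarts at step MAIN_STEP"), so the run resumed a previous job rather than starting fresh, which can produce confusing results. If the restart behaviour is unwanted, the checkpoint files can be cleared so the next run starts from scratch. A sketch, assuming the default checkpoint directory printed in the log (the clear_checkpoint helper is hypothetical, not a TPT API):

```python
import glob
import os

# Defaults taken from the job log above; adjust for your install.
CHECKPOINT_DIR = "/opt/teradata/client/14.10/tbuild/checkpoint"
JOB_NAME = "qstart1"

def clear_checkpoint(cp_dir, job):
    """Remove any leftover checkpoint files for the given job name."""
    removed = []
    for path in glob.glob(os.path.join(cp_dir, job + "*")):
        os.remove(path)
        removed.append(path)
    return removed

# After clearing, rerun:  tbuild -f qstart1.txt -v jobvars1.txt -j qstart1
```

This only clears client-side checkpoint state; whether it explains the empty table here depends on what the earlier interrupted run actually did.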
 
 
 

Forums: 

FastLoad parallel sessions limit. Is it overridable?


As many of you already know, Teradata has only 30 concurrent FastLoad slots available. In other words, you can have at most 30 simultaneous data flows loading data.
For our division this has become a bottleneck, because we need many more parallel transfers. Some of our sources (OLTP databases, for example) are located many miles away and are therefore very slow.

So please share your experience: how do you work around this? Are there any hidden tricks we don't know about?
Thanks.
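One common client-side workaround (not a way to raise the platform limit itself) is to queue load jobs and cap concurrency just below the slot limit, so jobs wait in the client rather than being rejected. A minimal sketch, where run_load() is a hypothetical stand-in for launching one FastLoad/TPT job:

```python
import queue
import threading

SLOT_LIMIT = 30                       # platform's concurrent-load cap
slots = threading.BoundedSemaphore(SLOT_LIMIT)

def run_load(job_name):
    # Placeholder for invoking fastload/tbuild for one table.
    return f"{job_name} done"

def throttled(job_name, results):
    with slots:                       # blocks until a slot frees up
        results.put(run_load(job_name))

def run_all(job_names):
    results = queue.Queue()
    threads = [threading.Thread(target=throttled, args=(j, results))
               for j in job_names]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return [results.get() for _ in job_names]
```

With this shape you can submit hundreds of transfers and never have more than SLOT_LIMIT in flight at once; slow WAN sources simply hold their slot longer.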

Forums: 

TPT Loading DateTime (SQL Server) with UTF-16 char set


Hi all,
I am having difficulty loading the DateTime data type with TPT from SQL Server into Teradata using the UTF-16 character set. I keep getting error 2673, 'SOURCE PARCEL LENGTH DOES NOT MATCH DATA THAT WAS DEFINED'. When I leave out the DateTime columns my script loads fine; it is only when I add a DateTime field that things go sour. Does anyone have ideas on how to correctly load the DATETIME data type into TIMESTAMP(3)? Exporting the data to a file with the bcp utility and doing a FastLoad works fine.
This is my script:

USING CHAR SET UTF16

DEFINE JOB TPT_LOAD_Dim_Billing_Document_Type
DESCRIPTION 'ODBC LOAD Dim_Billing_Document_Type TABLE'
(
  DEFINE SCHEMA ODBC_Dim_Billing_Document_Type
  (
    SK_Billing_Document_Type     INTEGER,
    Billing_Document_Type_Code   VARCHAR(100),
    Billing_Document_Type_Name   VARCHAR(200),
    Analytic_Relevence_Indicator VARCHAR(20),
    Billing_Document_Group_Code  VARCHAR(20),
    Record_Source_Timestamp      CHAR(46),
    Record_Checksum_SCDType1     INTEGER,
    Record_Checksum_SCDType2     INTEGER
  );

  DEFINE OPERATOR DDLOperator()
  TYPE DDL
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'ddl_log',
    VARCHAR TdpId = @MyTdpId,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR WorkingDatabase = @MyDatabase,
    VARCHAR ARRAY ErrorList = ['3807','3803']
  );

  DEFINE OPERATOR ODBC_Operator
  DESCRIPTION 'Teradata Parallel Transporter ODBC Operator'
  TYPE ODBC
  SCHEMA ODBC_Dim_Billing_Document_Type
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'odbc_log',
    VARCHAR DSNName = @jobvar_SQL_SERVER_DNS,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR SelectStmt = 'Select SK_Billing_Document_Type,
      Billing_Document_Type_Code,
      Billing_Document_Type_Name,
      Analytic_Relevence_Indicator,
      Billing_Document_Group_Code,
      convert(char(23), Record_Source_Timestamp, 121),
      Record_Checksum_SCDType1,
      Record_Checksum_SCDType2
      FROM [WILD_DWH].[DWH].[Dim_Billing_Document_Type];'
  );

  DEFINE OPERATOR Load_Operator
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  (
    VARCHAR ErrorTable1 = 'Dim_Billing_Document_Type_errors1',
    VARCHAR ErrorTable2 = 'Dim_Billing_Document_Type_errors2',
    VARCHAR LogTable = '"D0_EU_STG_T"."Dim_Billing_Document_Type_Log"',
    VARCHAR PrivateLogName = 'load_log',
    VARCHAR TargetTable = '"Dim_Billing_Document_Type"',
    VARCHAR TdpId = @MyTdpId,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR WorkingDatabase = @MyDatabase
  );

  STEP drop_and_create_the_table
  (
    APPLY
    ('DROP TABLE "Dim_Billing_Document_Type_errors1";'),
    ('DROP TABLE "Dim_Billing_Document_Type_errors2";'),
    ('DROP TABLE "Dim_Billing_Document_Type";'),
    ('CREATE MULTISET TABLE "Dim_Billing_Document_Type" (
        SK_Billing_Document_Type     INTEGER NOT NULL,
        Billing_Document_Type_Code   VARCHAR(50) CHARACTER SET UNICODE NOT NULL CASESPECIFIC,
        Billing_Document_Type_Name   VARCHAR(100) CHARACTER SET UNICODE CASESPECIFIC,
        Analytic_Relevence_Indicator VARCHAR(10) CHARACTER SET UNICODE CASESPECIFIC,
        Billing_Document_Group_Code  VARCHAR(10) CHARACTER SET UNICODE CASESPECIFIC,
        Record_Source_Timestamp      TIMESTAMP(3) NOT NULL,
        Record_Checksum_SCDType1     INTEGER NOT NULL,
        Record_Checksum_SCDType2     INTEGER NOT NULL
      );')
    TO OPERATOR (DDLOperator);
  );

  STEP load_the_data
  (
    APPLY
    ('INSERT INTO "Dim_Billing_Document_Type" (
        :SK_Billing_Document_Type,
        :Billing_Document_Type_Code,
        :Billing_Document_Type_Name,
        :Analytic_Relevence_Indicator,
        :Billing_Document_Group_Code,
        :Record_Source_Timestamp,
        :Record_Checksum_SCDType1,
        :Record_Checksum_SCDType2
      );')
    TO OPERATOR (Load_Operator)
    SELECT
      SK_Billing_Document_Type,
      Billing_Document_Type_Code,
      Billing_Document_Type_Name,
      Analytic_Relevence_Indicator,
      Billing_Document_Group_Code,
      Record_Source_Timestamp,
      Record_Checksum_SCDType1,
      Record_Checksum_SCDType2
    FROM OPERATOR (ODBC_Operator);
  );
);
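For context on the 2673 error (this is an assumption, since the failing parcel isn't shown): under a UTF-16 session every character occupies two bytes on the wire, so byte lengths in the schema must be doubled. The script's CHAR(46) for the char(23) result of convert(..., 121) reflects exactly this doubling. A quick sketch of the arithmetic, with a hypothetical style-121 sample value:

```python
# Byte-length arithmetic for the converted timestamp column.
sample = "2015-01-08 10:37:13.123"            # char(23) from convert(..., 121)
latin_bytes = len(sample.encode("latin-1"))    # 1 byte per char
utf16_bytes = len(sample.encode("utf-16-le"))  # 2 bytes per char
print(latin_bytes, utf16_bytes)                # 23 vs 46
```

If any column's declared byte length disagrees with what the ODBC driver actually sends under UTF-16, the parcel length no longer matches the schema, which is the textbook trigger for 2673.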
 
S-
 

Forums: 

TPT : ERROR using BYTE data comparison in JOBSCRIPT


Hi,
I am having a problem with the following script.
The table I am using is:

Create table Testdb.testtable

(

 COL1 CHAR(10),
 COL2 DECIMAL(9)

) PRIMARY INDEX(COL1);

and the following data
000179 AAAAAAAAAA     
            CCCCCCCCCCF44444   --------------> byte values "data1"
            1111111111  F00000
-----------------------
000180 BBBBBBBBBB
            CCCCCCCCCCF00000   --------------> byte values "data2"
            2222222222  F00001                
DEFINE JOB TESTJOB       
DESCRIPTION 'TESTTABLE'   
(                             
   DEFINE SCHEMA TEST_SCHEMA
   DESCRIPTION 'TEST_SCHEMA'           
 (
 COL1 CHAR(10),
 NULL01 BYTE,
 COL2 DECIMAL(9)
 );
   DEFINE OPERATOR UPDATE_OPERATOR
   DESCRIPTION 'UPDATE OPERATOR' 
   TYPE UPDATE                   
   SCHEMA *                      
   ATTRIBUTES    
 (
 VARCHAR TDPID=@TDPID,                          
 VARCHAR USERNAME=@USERNAME,                    
 VARCHAR USERPASSWORD=@USERPASSWORD,            
 VARCHAR PRIVATELOGNAME='<PVTLOGNAME>',
 INTEGER MAXSESSIONS=10,                        
 INTEGER MINSESSIONS=1,                         
 VARCHAR TARGETTABLE='TESTDB.TESTTABLE',  
 VARCHAR ERRORTABLE1='TESTDB.ET_TESTTABLE',
 VARCHAR ERRORTABLE2='TESTDB.UV_TESTTABLE',
 VARCHAR LOGTABLE='TESTDB.LG_TESTTABLE',  
 VARCHAR WORKTABLE ='TESTDB.WT_TESTTABLE'                 
 );
   DEFINE OPERATOR READ_OPERATOR 
   DESCRIPTION 'READ FILE'       
   TYPE DATACONNECTOR PRODUCER   
   SCHEMA ACC_OPN_SCHEMA         
   ATTRIBUTES                    
   (                             
       VARCHAR FILENAME='DD:DDIN',
       VARCHAR INDICATORMODE='N', 
       VARCHAR OPENMODE='READ',   
       VARCHAR FORMAT='UNFORMATTED'
   ) ;                        
   STEP LOAD_TABLES
   (              
     APPLY        
     (
    'INSERT INTO TESTDB.TESTTABLE
  (
  COL1
  ,COL2
  )
  VALUES
  (
  TRIM(COL1),
  :COL2
  );'
   ) TO OPERATOR(UPDATE_OPERATOR())
 SELECT
 COL1
 ,CASE WHEN NULL01 = 'FF'XB THEN NULL
  ELSE COL2 END AS COL2
 FROM OPERATOR(READ_OPERATOR());
   );
);
 
The error I am getting is:
TPT_INFRA: Syntax error at or near line 80 of Job Script File 'dd:SYSIN':
TPT_INFRA: At "SELECT" missing SEMICOL_ in Rule: STEP                   
Compilation failed due to errors. Execution Plan was not generated.     
Job script compilation failed.                                          
This is because of the lines

CASE WHEN NULL01 = 'FF'XB THEN NULL
ELSE COL2 END AS COL2

as the byte comparison fails to be recognised as such, I think. :)

Also, if I define COL2 as CHAR in the layout, data2 goes to the error tables and data1 goes to the target table; likewise, with "SELECT * FROM OPERATOR" in the job script, data1 gets inserted into the error table and data2 into the target table.
Can you guys help me with this?
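Whatever the script-language limitation turns out to be, the intent of the rejected CASE is simple: a one-byte X'FF' filler between COL1 and COL2 marks COL2 as NULL. Stated outside TPT (the resolve_col2 helper is purely illustrative):

```python
# Sketch of the intended NULL-indicator rule from the CASE expression:
# a 0xFF indicator byte means COL2 is NULL; anything else keeps COL2.
def resolve_col2(null01: bytes, col2):
    return None if null01 == b"\xff" else col2
```

This is the semantics the job script needs to express; the question is only which TPT construct (CASE in the APPLY SELECT, or handling in the INSERT) the parser will accept for it.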

Forums: 

Data Mover Help


I am new to Teradata and Data Mover. I am trying to create a Data Mover job via Viewpoint, and when I save it, it says the job already exists. But I don't see it in the jobs list, either from Viewpoint or the command line. How can I get rid of this duplication?
 
Thanks
Chris

Forums: 

Number of Utility Load Slots


Hi, 
Some utility load slot questions...
1.  Are there any guidelines for the number of utility load slots a Teradata system can have?  Is there a limit?  How is it determined?
2.  How do we configure Teradata to have more or less load slots? 
3.  Exactly what is a utility load slot?  For example, is it a dedicated AWT?
Jerry

Forums: 

Teradata Sql Assistant 14.10 Ctrl + Tab and Ctrl Page Up & Page Down not working


Hello.  First post...  I was recently upgraded at work, and the features in the subject line no longer work.  I looked in Tools > Settings and Tools > Customize but didn't see anything...
 
Thanks in advance...
Kirk

Forums: 

SQL Assistant Installation Error


Hello,
I have the Windows 7 operating system and IE version 9. When I try to install TD SQL Assistant, it gives me this error:
MDAC 2.6 requires Microsoft IE 4.01 Service Pack 2 or later.
I have IE version 9 on my machine. Can you help?
Thanks Devi

Forums: 

TPT - source data enclosed with double quotes and delimited by pipe (|)


Hi,
 
I am new to Teradata. I am trying to load data from a CSV file into Teradata 13.0 on VMware using TPT, where the columns are enclosed in double quotes (") and delimited by pipes (|).
 
e.g.

"1001"|"Amit"|"FIN"|"10001"|"1981-10-10"
"1002"|"Manish"|"FIN"|"25000"|"1990-01-01"
"1003"|"Sid"|""|"2000"|"1986-04-22"
"1004"|"Macho"|"GM"|""|"1972-05-19"

 

And I am using the script below:

DEFINE JOB LOAD_M1_TABLE_FROM_FILE
DESCRIPTION 'LOAD SAMPLE M1 TABLE FROM A FILE'
(
  DEFINE SCHEMA M1_SCHEMA
  DESCRIPTION 'SAMPLE M1 SCHEMA'
  (
   EMP_ID   varchar(12),
   EMP_NAME varchar(12),
   EMP_DEPT varchar(5),
   SALARY   varchar(12),
   DOB      varchar(12)
  );

  DEFINE OPERATOR DDL_OPERATOR()
  DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DDL OPERATOR'
  TYPE DDL
  ATTRIBUTES
  (
  VARCHAR PrivateLogName = 'tpt_script_ddloper_log',
  VARCHAR TdpId          = 'dbc',
  VARCHAR UserName       = 'dbadmin',
  VARCHAR UserPassword   = 'dbadmin',
  VARCHAR AccountID,
  VARCHAR ErrorList      = '3807'
  );

  DEFINE OPERATOR LOAD_OPERATOR()
  DESCRIPTION 'TERADATA PARALLEL TRANSPORTER LOAD OPERATOR'
  TYPE LOAD
  SCHEMA M1_SCHEMA
  ATTRIBUTES
  (
   VARCHAR PrivateLogName    = 'tpt_script_loadoper_privatelog',
   INTEGER MaxSessions       =  32,
   INTEGER MinSessions       =  2,
   VARCHAR TargetTable       = 'M1',
   VARCHAR TdpId             = 'dbc',
   VARCHAR UserName          = 'dbadmin',
   VARCHAR UserPassword      = 'dbadmin',
   VARCHAR AccountId,
   VARCHAR ErrorTable1       = 'tpt_script_LOADOPER_ERRTABLE1',
   VARCHAR ErrorTable2       = 'tpt_script_LOADOPER_ERRTABLE2',
   VARCHAR LogTable          = 'tpt_script_LOADOPER_LOGTABLE'
  );

  DEFINE OPERATOR FILE_READER()
  DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
  TYPE DATACONNECTOR PRODUCER
  SCHEMA M1_SCHEMA
  ATTRIBUTES
  (
   VARCHAR PrivateLogName    = 'tpt_script_dataconnoper_reader_privatelog',
   VARCHAR FileName          = 'code*.txt',
   VARCHAR IndicatorMode     = 'N',
   VARCHAR OpenMode          = 'Read',
   VARCHAR Format            = 'delimited',
   VARCHAR NullColumns       = 'Y',
   VARCHAR OpenQuoteMark     = '"',
   VARCHAR CloseQuoteMark    = '"',
   VARCHAR TextDelimiter     = '|',
   VARCHAR trimchar          = '"',
   VARCHAR MultipleReaders   = 'Y',
   varchar trimcolumns       = 'both',
   VARCHAR SkipRowsEveryFile = 'Y',
   INTEGER SkipRows          = 1
  );

  STEP setup_tables
  (
    APPLY
    ('DROP TABLE M1;'),
    ('DROP TABLE M1_LOADOPER_ERRTABLE1;'),
    ('DROP TABLE M1_LOADOPER_ERRTABLE2;'),
    ('DROP TABLE M1_LOADOPER_LOGTABLE;'),
    ('CREATE TABLE M1(EMP_ID varchar(10), EMP_NAME varchar(10), EMP_DEPT varchar(3), SALARY varchar(10), DOB varchar(10));')
    TO OPERATOR (DDL_OPERATOR());
  );

  STEP load_data_from_file
  (
    APPLY
    ('INSERT INTO M1(:EMP_ID, :EMP_NAME, :EMP_DEPT, :SALARY, :DOB);')
    TO OPERATOR (LOAD_OPERATOR() [1])
    SELECT * FROM OPERATOR (FILE_READER() [1]);
  );
);
Data is getting loaded, but the double quotes are being included in the data.

In Teradata the data is as below:

EMP_ID      EMP_NAME    EMP_DEPT  SALARY      DOB
----------  ----------  --------  ----------  ----------
"1004"     "Macho"    "GM       ""         "1972-05-1
"1002"     "Manish"   "FI       "25000"    "1990-01-0
"1007"     "Sidhu"    ""       "2000"     "1986-04-2
"1008"     "Mach"     "GM       ""         "1972-05-1
"1003"     "Sid"      ""       "2000"     "1986-04-2
"1006"     "Mani"     "FI       "25000"    "1990-01-0

If the column length is too small, the data is truncated before being loaded, e.g. columns EMP_DEPT and DOB.
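As a cross-check that the input quoting itself is well-formed, here is how a generic CSV parser (Python's csv module, used purely for illustration) handles the same rows when told both the delimiter and the quote character; this is the quote-stripping behaviour the reader attributes in the script are meant to produce:

```python
import csv
import io

# Two of the sample rows from the post, verbatim.
raw = (
    '"1001"|"Amit"|"FIN"|"10001"|"1981-10-10"\n'
    '"1003"|"Sid"|""|"2000"|"1986-04-22"\n'
)

rows = list(csv.reader(io.StringIO(raw), delimiter="|", quotechar='"'))
print(rows[0])   # -> ['1001', 'Amit', 'FIN', '10001', '1981-10-10']
```

Since the data parses cleanly this way, the quotes surviving into the table points at the reader configuration (or the TTU 13.0 DataConnector's support for the quote attributes) rather than at the file.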

 

 

Can you please help me out? I have been stuck on this for the last 3 days.

 

 

Regards

Atul

 

 

Forums: 

BTEQ DLL question - entry point not found


We have a pair of Windows Server 2008R2 machines (Xeon, 4 cores; 32GB RAM each) used as an Informatica 9.6.1 PowerCenter grid, which have several Teradata tools installed. 
ETL developers complained to me that BTEQ doesn't run right in the grid, so I had to shut one of the nodes down.  To test, I opened a command window on each machine and attempted to run bteq.
Server 1 can run BTEQ just fine, but Server 2 throws an error:
bteq.exe - Entry Point Not Found
The procedure entry point ecln_cleanupOne could not be located in the dynamic link library icuuctd36.dll
I've checked all environment variables and between Server 1 and Server 2 they are identical, even down to the order of the entries under PATH. 
I've tried uninstalling / reinstalling the ODBC for Teradata 32 bit and 64 bit, as well as CLIv2 based on some of what I've found online, but to no avail.
I'm suspecting there is a corruption in one of the components that bteq relies on, but I'm not familiar enough with the tools to know which one. 
Does anyone have any thoughts on what is missing somewhere or where to go next with this? 
There's quite a list of installed Teradata components (I inherited these machines this way, not sure what we have that might be extraneous but I'm sure there is clutter):
Teradata Administrator 13.10.0.2
Teradata Administrator 13.10.0.3
Teradata ARC 13.10.0.7
Teradata BTEQ 13.10.0.5
Teradata C PP2 13.10.0.3
Teradata ClIv2 13.10.0.8
Teradata CLIv2 nt-x8664 13.10.0.2
Teradata Data Connector 13.10.0.2
Teradata Data Connector nt-8664 13.10.0.9
Teradata FastExport 13.10.0.3
Teradata FastLoad 13.10.0.3
Teradata GSS Client nt-i386 13.10.0.3
Teradata GSS Client nt-i386 13.10.0.6
Teradata GSS Client nt-x8664 13.10.0.6
Teradata GSS Client nt-x8664 13.10.2.2
Teradata GSS Client nt-x8664 13.10.4.1
Teradata GSS Client nt-x8664 14.10.1.6
Teradata Index Wizard 13.10.0.3
Teradata MultiLoad 13.10.0.3
Teradata Named Pipes Access Module 13.10.0.3
Teradata Named Pipes Access Module nt-x8664 13.10.0.3
Teradata OLE DB Acess Module 13.10.0.2
Teradata Parallel Transporter API 13.10.0.2
Teradata Parallel Transporter API x8664 13.10.0.7
Teradata Parallel Transporter Base nt-8664 13.10.0.3
Teradata Parallel Transporter Export Operator 13.10.02
Teradata Parallel Transporter Export Operator x8664 13.10.0.6
Teradata Parallel Transporter Infrastructure 13.10.0.2
Teradata Parallel Transporter Load Operator 13.10.0.2
Teradata Parallel Transporter Stream Operator 13.10.0.2
Teradata Parallel Transporter Stream Operator x8664 13.10.0.7
Teradata Parallel Transporter Update Operator 13.10.0.2
Teradata Parallel Transporter Update Operator x8664 13.10.0.4
Teradata Performance Monitor Object 13.10.0.1
Teradata SQL Assistant 13.10.0.4
Teradata Statistics Wizard 13.1.0.2
Teradata System Emulation Tool 13.10.0.4
Teradata TPump 13.10.0.2
Teradata Visual Explain 13.10.0.4
Teradata Workload Analyzer 13.10.0.3
Many thanks in advance to anyone who has a suggestion.

Forums: 

Oracle.Net Not under Connection Dialogue


I have installed Teradata SQL Assistant 15.0 and all of the other tools in the TTU 15 package.
I have a Windows 7 64-bit machine.
I noticed that Oracle.Net is not in the connection dialogue menu under Provider.
I have ODP.Net installed for other Oracle applications.
Can someone please help with how to get Oracle.Net into the connection dialogue, or propose another connection method excluding ODBC?
I am also using TNS names for another Oracle application, if that helps.
Appreciate all of the help in advance.

Forums: 

SQL Assistant 15.00


Hi there,
When running queries in a Query tab, I remember that in previous versions the colour of the tab would change to indicate that a query was running, and change again when the query had completed.
This functionality isn't working for me in 15.00: whilst the query itself is highlighted while being executed, the colour of the tab never changes... I wonder if there's a setting I have missed?
Thanks
 

Forums: 

Teradata SQL Assistant (SQL to highlight columns in results)


Hello:
Does anyone know how to write code to have a column, or columns, highlighted in the result set?  Or maybe have the font in a different colour...?
I can manually double-click a row header once the results are returned and the Format Cells window opens; from there I can highlight a column, but I'd rather not have to search for that, highlight, etc.
Thanks,
Kirk

Forums: 

TDA connection


hi dnoeth,
It is so weird: I could always use my TDA to connect via ODBC, but I can't do it today.
I did what I usually do: opened the VM, ran /etc/init.d/tpa start and then tpestate -a; bteq logs on successfully; but when I finally use TDA to connect, I can't, and get a 10060 WSAETIMEDOUT error...
Shayne.

Forums: 

Need to load from Table 1 to Table 2. Which utility work well here and WHY?


I need to load from Table 1 to Table 2. Which utility works well here, and why?
Please explain the limitations and advantages of each utility.

Forums: 

SQL Assistant - View tables in the Database Explorer

unable to load varchar(max) / nvarchar(max) data from sql server to Teradata using TPT


Hi All,
 
I am loading varchar(max) / nvarchar(max) data from SQL Server into Teradata using TPT and the DataDirect drivers. The trimmed length of the varchar(max) column is 5356651. I am unable to load this data, as the DataDirect driver substitutes varchar(-1) for varchar(max).
Any idea how I can load this data into Teradata?

 

Tags: 
Forums: 

Fastload Skips Header when using from Java


Hi All,
I am new to Teradata and am currently trying out JDBC FastLoad. In my application, the first record is skipped every time I try to export data from a CSV file into the Teradata DB. How can I make sure that the first record is not skipped?
Please advise.
   
Regards
Shekhar      

Forums: 

Arcmain Estimates


Hi,
Can we estimate the time needed for an ARCMAIN data copy/restore using a linear scale from sample data? I.e., if 5 GB is copied in 5 minutes, then 1 TB would take 1000 minutes.
Regards,
Harpreet
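For reference, the back-of-envelope arithmetic in the question can be sketched as below (taking 1 TB as 1000 GB, to match the post). In practice throughput rarely scales perfectly linearly, since session counts, network bandwidth, and AMP skew all interfere, so treat this as a floor estimate at best:

```python
# Linear-scale estimate: minutes-per-GB from a sample run,
# applied to the full target volume.
def estimate_minutes(sample_gb, sample_minutes, target_gb):
    rate = sample_minutes / sample_gb      # minutes per GB
    return target_gb * rate

estimate_minutes(5, 5, 1000)   # -> 1000.0 minutes for 1 TB
```

A more defensible approach is to time a sample that is large enough to amortise job startup overhead, since a 5 GB run is dominated by fixed costs that won't grow with volume.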

Tags: 
Forums: 