Channel: Teradata Downloads - Tools
Viewing all 870 articles

Problem accessing a function created on top of a JAR file installed in the database


Hi Tom,

We are in the process of creating a Java user-defined function (JUDF). To do that, we have deployed the JAR file in the database and are creating a function/procedure on top of it, using the two approaches below:

 

Approach 1) Using the Teradata JUDF wizard plug-in in Eclipse, as per the steps below:

     a) Installed the Teradata plug-in in Eclipse per these instructions:

          https://downloads.teradata.com/download/tools/teradata-plug-in-for-eclipse

     b) Created the Teradata project and imported the Java libraries and Java source code per the steps in the Teradata JUDF wizard, setting up the input and output parameters of the function per these instructions:

          http://developer.teradata.com/tools/articles/creating-a-simple-java-user-defined-function-using-the-teradata-plug-in-for-eclipse

     c) Deployed the JAR file and generated the SQL for the function in the database.

 

 

The JAR file gets deployed and the function gets created in the database, but executing the function results in the error:

[3604] Cannot place a NULL into NOT NULL field

Note: the Java source already exists in production and works fine with no issues; it returns results when run through Eclipse (Run --> Run Configurations, specifying the arguments, then clicking Run; the output is displayed with no issues).

 

 

 

 

Approach 2) Deploying using Teradata external stored procedures:

 

 

 

Placed the JAR file at this path and invoked a bteq session:

C:\Program Files (x86)\Java\jdk1.8.0_73\bin>bteq

Teradata BTEQ 15.10.01.00 for WIN32. PID: 4736
Copyright 1984-2015, Teradata Corporation. ALL RIGHTS RESERVED.
Enter your logon or BTEQ command:
 

 

CALL SQLJ.INSTALL_JAR('CJ!jar1.jar', 'jar1', 0);

 

The JAR file gets deployed successfully, but the problem comes when creating an external procedure on top of it:

 

 

DATABASE syslib;

 

CREATE PROCEDURE myjxsp1

( INOUT R INTEGER )

LANGUAGE JAVA NO SQL

PARAMETER STYLE JAVA

EXTERNAL NAME 'jar1:test.Main.class.myjxsp1';    -- test is the package name in the Java manifest file, and Main is the class.

*** Failure 7980 A JAVA method in the specified Jar which matches that in the EXTERNAL NAME clause was not found: /etc/opt/teradata/tdconfig/jarlib/tdbs_1001/jarlib_1001_510889_1.jar.
                Statement# 1, Info = 0
*** Total elapsed time was 2 seconds.
 
It looks like it is referring to a different path. Can you suggest how to resolve this?
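For what it's worth, a hedged sketch of one thing to check (not a confirmed fix): per Teradata's external-routine conventions, the EXTERNAL NAME clause takes the form 'jarid:package.ClassName.methodName' with no '.class' segment, and the Java method must be public static with a signature matching the SQL parameters (an INOUT INTEGER maps to int[]). The names jar1, test, Main and myjxsp1 below are taken from the post:

```shell
# Write the candidate DDL to a file; note the EXTERNAL NAME contains no
# ".class" segment -- the format is jarid:package.Class.method.
cat > create_xsp.sql <<'EOF'
DATABASE syslib;

CREATE PROCEDURE myjxsp1
( INOUT R INTEGER )
LANGUAGE JAVA NO SQL
PARAMETER STYLE JAVA
EXTERNAL NAME 'jar1:test.Main.myjxsp1';
EOF

# The file would then be run through bteq (logon omitted here):
#   bteq < create_xsp.sql
grep "EXTERNAL NAME" create_xsp.sql
```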
 
 
Regards,
nishant
 


Cursor remains on last line during Find and Replace in SQL Assistant Ver 15.10.1.3


In earlier versions the cursor used to remain at the top of the SQL member; it now drops to the last line affected by the Find and Replace.
Would you please change the Find and Replace functionality to keep the cursor position at the top of the SQL member?


Issues in Multiload, Insert and Update for Multiple tables


Hi Gurus,
I need information about "Candidate records not applied is 1", and I am not able to find anything about it.
We are pulling data from the source system (Oracle) into a target table (Teradata). When we insert the data into the target table from the source-system file through the MultiLoad tool, one column (STRING) ends up NULL in the target while all the remaining columns load correctly, and we are not sure why that one column's data is missing.
FYI: we are pulling 100 records from the source, but only 91 records are loaded into the target table.
Can someone explain this, or tell us if we are missing anything in analyzing the data? The data looks good in both source and target.
We see "Candidate records not applied is 1" in the .rpt file, and we don't see any failures or other issues there. Is that an issue?
Oracle column definition:
STRING                       VARCHAR2(160)
Teradata column definition:
STRING VARCHAR(160) CHARACTER SET
 
Thanks


TPT wizard finish button to run is not doing anything


I am using TTU 15.00 (TPT wizard) to transfer data from one source to another. There is no response when I click the Finish button to run the job. Has anybody had the same issue, and how was it resolved?


How to get values from TPT log file?


Hello,
This is my TPT log file
Teradata Parallel Transporter Version 14.00.00.03
Job log: /opt/teradata/client/14.00/tbuild/logs/mytest-175.out
Job id is mytest-175, running on xxxxxx
Found CheckPoint file: /opt/teradata/client/14.00/tbuild/checkpoint/ mytestLVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter SQL Selector Operator Version 14.00.00.03
SQL_SELECTOR: private log specified: selector_log
Teradata Parallel Transporter DataConnector Version 14.00.00.03
FILE_WRITER Instance 1 directing private log report to 'dataconnector_log-1'.
FILE_WRITER Instance 1 restarting.
FILE_WRITER: TPT19007 DataConnector Consumer operator Instances: 1
FILE_WRITER: TPT19003 ECI operator ID: FILE_WRITER-23162
SQL_SELECTOR: connecting sessions
SQL_SELECTOR: restarting the job
FILE_WRITER: TPT19222 Operator instance 1 processing file '/xxxx/xxxx/xxxxx/files/history/dataset1_201209.csv'.
SQL_SELECTOR: sending SELECT request
SQL_SELECTOR: retrieving data
SQL_SELECTOR: Total Rows Exported:  20
SQL_SELECTOR: finished retrieving data
SQL_SELECTOR: disconnecting sessions
FILE_WRITER: TPT19221 Total files processed: 1.
SQL_SELECTOR: Total processor time used = '0.11 Second(s)'
SQL_SELECTOR: Start : Thu Nov 21 01:05:20 2013
SQL_SELECTOR: End   : Thu Nov 21 01:05:26 2013
Job step MAIN_STEP completed successfully
Job mytest completed successfully
From the above log file I want to make use of certain values. For example, from this line:
SQL_SELECTOR: Total Rows Exported:  20
I want to take the value 20 and insert it into a database table. Similarly, I need to take the total processor time used, total files processed, etc.
How do I do this? Do I need to parse the entire text file, or is there an alternative way of doing it?
Thanks.
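There is no structured output mode shown in the log above, so parsing the text is the usual route. A minimal shell sketch with sed patterns derived from the log lines above (the log file name and the bteq insert shown in the comment are illustrative assumptions):

```shell
# Extract a few values from a TPT job log with sed.
# The sample fragment below is taken from the log in the post.
cat > tpt_job.log <<'EOF'
SQL_SELECTOR: Total Rows Exported:  20
FILE_WRITER: TPT19221 Total files processed: 1.
SQL_SELECTOR: Total processor time used = '0.11 Second(s)'
EOF

rows=$(sed -n "s/.*Total Rows Exported:[[:space:]]*\([0-9]*\).*/\1/p" tpt_job.log)
files=$(sed -n "s/.*Total files processed:[[:space:]]*\([0-9]*\).*/\1/p" tpt_job.log)
cpu=$(sed -n "s/.*Total processor time used = '\([^']*\)'.*/\1/p" tpt_job.log)

echo "rows=$rows files=$files cpu=$cpu"
# The values could then be loaded with a generated bteq script, e.g.:
#   echo "INSERT INTO mydb.job_stats VALUES ($rows, $files);" | bteq
```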


Error while loading DATE column using TPT


Hi,
I am new to TPT (version 14.10.00.04). I have a CSV file from which I am trying to load data using TPT. Dates in the CSV file are in 'DD/MM/YYYY' format, and the target table has DATE columns with the format defined as 'DD/MM/YYYY'. I am getting an error while loading the data from the CSV file into this Teradata table: "LOAD_OPERATOR: TPT10508: RDBMS error 3618: Expression not allowed in Fast Load Insert, column PTB_DATEVALUE"
Below is the script I am using that causes the error:
 
DEFINE JOB  TD_LD(
 DEFINE OPERATOR LOAD_OPERATOR
        TYPE LOAD
        SCHEMA *
        ATTRIBUTES
        (
        VARCHAR PrivateLogName ,VARCHAR TdpId, VARCHAR UserName ,
        VARCHAR UserPassword ,VARCHAR TargetTable , VARCHAR LogTable ,
        VARCHAR WorkingDatabase ,VARCHAR ErrorTable1 , VARCHAR ErrorTable2,
        INTEGER  MaxSessions
        );
 DEFINE OPERATOR DDL_OPERATOR()
        TYPE DDL
        ATTRIBUTES
        (
                VARCHAR TdpId = '<IP Address>',
                VARCHAR UserName = '<userid>',
                VARCHAR UserPassword = '<password>',
                VARCHAR WorkingDatabase = 'STGDB',
                VARCHAR ARRAY ErrorList = ['3807','3803','5980']
        );
 DEFINE SCHEMA TABLE1_SCH
        (
       TESTCASENUM VARCHAR(50) ,
      PTB VARCHAR(50) ,
      CTB VARCHAR(50) ,
      PTB_DATEVALUE VARCHAR(50) ,
      CTB_DATEVALUE VARCHAR(50)
                );

       DEFINE OPERATOR FILE_READER
        TYPE DATACONNECTOR PRODUCER
        SCHEMA TABLE1_SCH
        ATTRIBUTES
        (
        VARCHAR DirectoryPath ,VARCHAR FileName ,VARCHAR Format,
        VARCHAR OpenMode ,VARCHAR TextDelimiter ,VARCHAR AcceptMissingColumns
        ,VARCHAR AcceptExcessColumns
       );
STEP SETUP_TABLES
 (

  APPLY
  ('
CREATE MULTISET TABLE STGDB.TABLE1 ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
             TESTCASENUM DECIMAL(8,2),
      PTB VARCHAR(35) CHARACTER SET LATIN NOT CASESPECIFIC,
      CTB VARCHAR(35) CHARACTER SET LATIN NOT CASESPECIFIC,
      PTB_DATEVALUE DATE FORMAT 'DD/MM/YYYY',
      CTB_DATEVALUE DATE FORMAT 'DD/MM/YYYY'
)
PRIMARY INDEX ( TESTCASENUM );
')

TO OPERATOR (DDL_OPERATOR());
);

STEP LOAD_TABLE
 (
 APPLY
   ('INSERT INTO STGDB.TABLE1
   (
 TESTCASENUM ,
      PTB  ,
      CTB  ,
       PTB_DATEVALUE  ,
      CTB_DATEVALUE  
 )
    VALUES
    (

      :TESTCASENUM ,
      :PTB  ,
      :CTB  ,
      CAST(:PTB_DATEVALUE  AS DATE FORMAT ''DD/MM/YYYY''),
      CAST(:CTB_DATEVALUE  AS DATE FORMAT ''DD/MM/YYYY'')
);
')

  TO OPERATOR
 (
  LOAD_OPERATOR[4]
  ATTRIBUTES
  (
    PrivateLogName = 'table1_log',
    TdpId = '<IP Address>',
    UserName = '<userid>',
    UserPassword = '<password>',
    TargetTable = 'stgdb.table1',
    LogTable = 'stgdb.table1_log',
    WorkingDatabase = 'stgdb',
    ErrorTable1 = 'stgdb.table1_E1',
    ErrorTable2 = 'stgdb.table1_E2',
    MaxSessions = 30
)
 )

 SELECT
 TESTCASENUM ,
      PTB,
      CTB  ,
      PTB_DATEVALUE  ,
      CTB_DATEVALUE  
FROM OPERATOR
 (
  FILE_READER[4]
  ATTRIBUTES
  (
    DirectoryPath = '/sample/',
    FileName = 'test.csv',
    Format = 'Delimited',
    OpenMode = 'Read',
    TextDelimiter = ',',
         AcceptMissingColumns = 'Y',
        AcceptExcessColumns = 'Y'
)
 );
);
);
 
Please help me understand where the issue is; your suggestions to resolve it would be much appreciated.
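Error 3618 generally indicates that the Load operator (FastLoad protocol) does not accept expressions such as CAST in its INSERT statement. One hedged workaround, assuming the date columns are columns 4 and 5 of the delimited file: pre-convert the dates in the file before the load (or, alternatively, insert :PTB_DATEVALUE directly and rely on the column's FORMAT for implicit conversion). An awk sketch of the pre-conversion; the sample row is made up:

```shell
# Convert DD/MM/YYYY in columns 4 and 5 of a comma-delimited file
# to ANSI YYYY-MM-DD before handing the file to TPT.
cat > test.csv <<'EOF'
TC1,a,b,25/12/2015,01/02/2016
EOF

awk -F',' 'BEGIN{OFS=","}
{
  for (i = 4; i <= 5; i++) {
    split($i, d, "/")                 # d[1]=DD, d[2]=MM, d[3]=YYYY
    $i = d[3] "-" d[2] "-" d[1]
  }
  print
}' test.csv > test_ansi.csv

cat test_ansi.csv
```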


Multi Load UTY0006 Input error in the FIELD command


I am trying to load a table from a file generated by FastExport, and I am getting the error below:

UTY0006 Input error in the FIELD command at position 31: "(15)"

Snippet from the script:

.field IBNRSkadeestimat *FLOAT(15);

I shall post the complete script tomorrow.

Please advise.


TPT without schema for STREAM Operator


Hi All,
With TTU 14, we could create templates for TPT without the need to specify a SCHEMA definition. I was able to implement this successfully for the EXPORT, SELECTOR and LOAD operators; however, I am unable to do it for STREAM. I get the following error when I try it along the same lines as LOAD. Please find the script and the exact error below.
 
Script
USING CHARACTER SET @v_utf 

DEFINE JOB load_template 

(

  APPLY ('INSERT INTO MyTable;') TO OPERATOR ($STREAM() ATTR (TdpId=@v_tdpid, UserName=@v_userid, UserPassword=@v_password, MaxSessions=@v_sessions, DateForm = 'ANSIDATE', INTEGER Pack = @v_pack))

  SELECT * FROM OPERATOR($FILE_READER(DELIMITED @v_tablename) ATTR (FileName=@v_file_name, Format=@v_format, TextDelimiter=@v_delimiter_value, OpenMode = 'Read', IndicatorMode = @v_mode));

);

 

Error

 

TPT_INFRA: Syntax error at or near line 7 of Job Script File 'stream_template_delimited.out.ctl':

TPT_INFRA: At "INTEGER" missing { REGULAR_IDENTIFIER_ EXTENDED_IDENTIFIER_ EXTENDED_IDENTIFIER_NO_N_ } in Rule: Regular Identifier

TPT_INFRA: Syntax error at or near line 7 of Job Script File 'stream_template_delimited.out.ctl':

TPT_INFRA: TPT03247: Rule: Attribute Value Specification

TPT_INFRA: Semantic error at or near line 9 of Job Script File 'stream_template_delimited.out.ctl':

TPT_INFRA: TPT03111: Rule: Restricted APPLY Statement

Compilation failed due to errors. Execution Plan was not generated.

Job script compilation failed.

 

Please let me know if you need any additional information.

 

Thanks,

Naveen K


Need Help regarding TPT ODBC Operator


Hi All,

I created an ODBC connection entry in the ODBCINI file defined in .profile. The OS is RHEL (Red Hat Enterprise Linux Server release 5.9 (Tikanga)); the TPT version is 14.10.00.08.

INFA_HOME=/opt/Informatica/PowerCenter9.6.0; export INFA_HOME
ODBCHOME=$INFA_HOME/ODBC7.1; export ODBCHOME
ODBCINI=$ODBCHOME/odbc.ini; export ODBCINI

Below is the ODBC entry. This is for Oracle source.

[VCMC_Oracle]
Driver=/opt/Informatica/PowerCenter9.6.0/ODBC7.1/lib/ddora27.so
Description=DataDirect 7.1 Oracle Wire Protocol
HostName=<HostName>
PortNumber=1521
ServiceName=<OracleServiceName>

Using the ODBC operator and the DataConnector consumer operator, I'm trying to write the data into a delimited text file.

Below is my tpt script.

 

DEFINE JOB EXPORT_COL_BASE_TO_FILE
DESCRIPTION 'export EXPORT_COL_BASE_TO_FILE'
     (
        DEFINE SCHEMA SCHEMA_COL_BASE
            (
                                SYS_COL NUMBER(4),
				PRIN_COL NUMBER(4),
				AGNT_COL NUMBER(4),
				COLL_CODE_COL NUMBER(2),
				DELQ_FAMILY_COL CHAR(3),
				DELQ_FAMILY_DESCR_COL VARCHAR(25),
				DROP_DTE_COL VARCHAR(19),
				LS_WORK_DTE_COL VARCHAR(19),
				LS_TRAN_DTE_COL VARCHAR(19),
				NO_ACTS_COL	NUMBER(3),
				NO_MEMOS_COL NUMBER(3),
				REACTIVE_DTE_COL VARCHAR(19),
				SUB_ACCT_NO_COL CHAR(16),
				START_DTE_COL VARCHAR(19),
				WORK_DTE_COL VARCHAR(19)
			);

        DEFINE OPERATOR o_ODBCOper
        TYPE ODBC
        SCHEMA SCHEMA_COL_BASE
        ATTRIBUTES (
            VARCHAR UserName            = @UserName
           ,VARCHAR UserPassword        = @UserPassword
           ,VARCHAR DSNName             = @DSNName
           ,VARCHAR PrivateLogName      = 'loadlog'
           ,VARCHAR SelectStmt          = @SelectStmt
           ,VARCHAR TraceLevel          = 'all'
        );

        DEFINE OPERATOR o_FileWritter
        TYPE DATACONNECTOR CONSUMER
        SCHEMA SCHEMA_COL_BASE
        ATTRIBUTES (
         VARCHAR FileName               = @FileName
        ,VARCHAR Format                 = @Format
        ,VARCHAR TextDelimiter          = @TextDelimiter
        ,VARCHAR IndicatorMode          = 'N'
        ,VARCHAR OpenMode               = 'Write'
        ,VARCHAR PrivateLogName         = 'DataConnector'
        ,VARCHAR TraceLevel             = 'all'
        );
        APPLY TO OPERATOR (o_FileWritter[@LoadInst])
           SELECT * FROM OPERATOR (o_ODBCOper[@ReadInst]);
     )
     ;

Below is my tbuild command:

tbuild -f /home/aroy001c/Sample/ctl/col_base.tpt.ctl -v /home/aroy001c/Sample/logon/aroy001c_tpt.logon -u " UserName='XXXXX' , UserPassword='XXXXX' , DSNName='VCMC_Oracle' , load_op=o_ODBCOper , LoadInst=1 , ReadInst=1 , FileName='/home/aroy001c/Sample/tgtfile/col_base.out' , LOAD_DTS='2016-04-27 08:21:34' , Format='DELIMITED' TextDelimiter='$^$' , SkipRows=0 , SelectStmt='SELECT SYS_COL,PRIN_COL,AGNT_COL,COLL_CODE_COL,DELQ_FAMILY_COL,DELQ_FAMILY_DESCR_COL,DROP_DTE_COL,LS_WORK_DTE_COL,LS_TRAN_DTE_COL,NO_ACTS_COL,NO_MEMOS_COL,REACTIVE_DTE_COL,SUB_ACCT_NO_COL,START_DTE_COL,WORK_DTE_COL FROM COL_BASE;'" COL_BASE

When I run the tbuild command, I'm not able to connect to the source.

[aroy001c@pacdcpaprdetl1 bin] tlogview -l /opt/teradata/client/14.10/tbuild/logs/COL_BASE-5847.out -f '*' -g

Public log:


Using memory mapped file for IPC

TPT_INFRA: TPT04101: Warning: Teradata PT cannot connect to Unity EcoSysetm Manager.
             The job will continue without event messages being sent to Unity EcoSystem Manager.
TPT_INFRA: TPT04190: Warning: OMD API failed to initialize
Found CheckPoint file: /opt/teradata/client/14.10/tbuild/checkpoint/COL_BASELVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter Executor Version 14.10.00.08
Teradata Parallel Transporter Coordinator Version 14.10.00.08
Teradata Parallel Transporter Executor Version 14.10.00.08
Teradata Parallel Transporter DataConnector Version 14.10.00.08
o_FileWritter: Instance 1 directing private log report to 'DataConnector-1'.
o_FileWritter: DataConnector Consumer operator Instances: 1
o_FileWritter: ECI operator ID: 'o_FileWritter-25430'
o_FileWritter: Operator instance 1 processing file '/home/aroy001c/Sample/tgtfile/col_base.out'.
Teradata Parallel Transporter ODBC Operator Version 14.10.00.08
o_ODBCOper: private log specified: loadlog-1
o_ODBCOper: connecting sessions
o_ODBCOper: TPT17122: Error: unable to connect to data source
o_ODBCOper: TPT17101: Fatal error received from ODBC driver:
              STATE=IM003, CODE=0,
              MSG='[DataDirect][ODBC lib] Specified driver could not be loaded'
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
o_ODBCOper: disconnecting sessions
o_ODBCOper: TPT17124: Error: unable to disconnect from data source
o_ODBCOper: TPT17101: Fatal error received from ODBC driver:
              STATE=08003, CODE=0,
              MSG='[DataDirect][ODBC lib] Connection not open'
o_ODBCOper: Total processor time used = '0.01 Second(s)'
o_ODBCOper: Start : Tue May 10 08:21:24 2016
o_ODBCOper: End   : Tue May 10 08:21:24 2016
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
o_FileWritter: Total files processed: 0.
Job step MAIN_STEP terminated (status 12)
Job COL_BASE terminated (status 12)
Job start: Tue May 10 08:21:20 2016
Job end:   Tue May 10 08:21:24 2016
Total available memory:          20000676
Largest allocable area:          20000676
Memory use high water mark:         45020
Free map size:                       1024
Free map use high water mark:          19
Free list use high water mark:          0

So, to create the ODBC DSN for Teradata to connect to Oracle, do I need to make the entry somewhere else, or have I made the entry in the correct place?

Is there any tool to test the connection?
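Not a definitive answer, but STATE=IM003 ('Specified driver could not be loaded') typically means the driver's own shared-library dependencies cannot be resolved at run time, rather than the odbc.ini entry being in the wrong place. A sketch of the environment check, using the paths from the post; ddtestlib is the load tester DataDirect normally bundles, so verify it exists in this installation before relying on it:

```shell
# Make the DataDirect driver libraries resolvable before running tbuild.
ODBCHOME=/opt/Informatica/PowerCenter9.6.0/ODBC7.1
export ODBCINI=$ODBCHOME/odbc.ini
export LD_LIBRARY_PATH=$ODBCHOME/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}

# ldd lists any dependencies of the driver that fail to resolve:
#   ldd $ODBCHOME/lib/ddora27.so | grep "not found"
# DataDirect's own load tester, if present:
#   $ODBCHOME/bin/ddtestlib $ODBCHOME/lib/ddora27.so
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
```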

 

Thanks & Regards,

Arpan.

 


TPT how to combine parameter value with flat file data ?


Hi all,
 
I am using TPT to load flat files into Teradata, using OPERATOR IMPORT_OPERATOR TYPE LOAD, like this:
        APPLY (
                'INS INTO ' || @StageDatabase || '.t_payment
                (                   
                    :BILLING_ACC_NO,
                    :DHL_INVOICE_NO,
                    :FISICAL_INVOICE_NO,
                    :PAYMENT_DATE,
                    :PAYMENT_REFERENCE,
                    :PAYMENT_MODE,
                    :INVOICE_DATE,
                    :DUE_DATE,
                    :PAYMENT_APPLY_DATE,
                    :PAID_AMOUNT,
                    :LOCAL_CURRENCY,
                    :LAST_MODIFIED_TIMESTAMP,
                    :INSERT_DTM        
                 );'
        )
        TO OPERATOR (import_operator[1])
        SELECT *
        FROM OPERATOR (data_connector[1]);
 
Now I need to add one more column to the target that is not contained in data_connector[1]; its value comes in as a parameter.
Can you please advise?
Thank you


Better performance than fastexport


Hi,
I would like to know whether TPT is faster than FastExport.
We want to export 180 TB of data into a file system; currently, through FastExport, we are achieving 1 TB per day. Can I expedite the process so that I can meet my deadlines?
 
Thanks in advance


Executing multiple TPT scripts simultaneously: will it yield better performance?


Hi,
Can I execute multiple scripts with different producers (source tables) and different consumers (target files) simultaneously to get better performance?
 
Thanks in advance
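At the shell level, running independent jobs concurrently is straightforward (whether it actually helps depends on client CPU/IO and the session limits on the Teradata side). A sketch with placeholder script and job names; the echo stands in for the real tbuild call, and each job needs a distinct job name so checkpoint files do not collide:

```shell
# Launch several independent TPT jobs concurrently and wait for all of them.
# export_tableN.tpt and jobN are illustrative placeholders.
run_job () {
  # Stand-in for: tbuild -f "$1" "$2"
  echo "started $2 from $1"
}

for n in 1 2 3; do
  run_job "export_table$n.tpt" "job$n" > "job$n.out" &
done
wait   # block until every background job has finished

cat job1.out job2.out job3.out
```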


Scheduling stats manager collection jobs outside of Viewpoint?


Hi. Let me start by saying I love the stats management facility in Viewpoint. The only problem I have is with scheduling. My users run mini-batches 24x7 from Informatica, managed by Tivoli Scheduler, and I have had to integrate my backups into that schedule to avoid locking/performance issues. I now need to integrate the stats collection into the schedule too. Is there a facility to start the Stats Manager jobs via the command line? If so, I can shell-script the job invocations and use Tivoli to manage the schedule.


Query Grid 15.0 to 15.10


Hello All,
 
Is it possible to use QueryGrid across systems where one of the two is running 15.0 and the other 15.10?
Has anyone encountered any kind of errors when the versions differ?
 
Regards,
KN


DATA_Selector: TPT15112: Column #1 in schema is too small to hold 11 bytes of data


Hi All,
 
I don't know where to look for this error in my script. Please help.


tdwallet not working


Using tdwallet version 15.10.00.02.
I cannot get $tdwallet to work in any way. The tdwallet utility itself works and lets me add and delete entries, but in bteq the value is not substituted and I get a logon error. For ease, I created an entry u with the username as its value and an entry p with the password.
tdwallet list shows the entries are there:
 >tdwallet list
u
p
I tried bteq in a script with the logon like this:
.logon mydatabase/$tdwallet(u),$tdwallet(p) ;
This generates:
 *** Failure 8017 The UserId, Password or Account is invalid.
 *** Error: Logon failed!
 
If I try
.logon mydatabase/myuser,$tdwallet(p);
I get the same result.
 
If I try it interactively, to test whether tdwallet will supply the user while I supply the password at the prompt: same result.
If, in the script, I create a variable LOGONCMD=".logon mydatabase/$tdwallet(u),$tdwallet(p) " and supply $LOGONCMD to bteq inside the script: same result.
If I change $tdwallet(u) and $tdwallet(p) to the actual values, the bteq works as expected.
Looking at the output when trying to use $tdwallet, it always looks as though something is evaluating a $tdwallet variable, finding it undefined like a shell variable, and leaving just the (u) behind, as it would outside of bteq. I have changed the real database name to "mydatabase" in this example.
BTEQ 15.10.00.04 Tue May 17 20:42:47 2016 PID: 27672

+---------+---------+---------+---------+---------+---------+---------+----
.logmech TD2;
+---------+---------+---------+---------+---------+---------+---------+----
.LOGON mydatabase/(u),                            ( NOTE: I typed  $tdwallet(u)  here.  What is shown is the output)
Password:

 *** Failure 8017 The UserId, Password or Account is invalid.
 *** Error: Logon failed!

I know the user/password combination is correct; if I substitute the actual values, the logon works normally. I have tried deleting the u and p entries from tdwallet and recreating them just to be certain the values are correct. No change in the results.
I tried to turn on the debug information; it isn't helpful:
CALL wallet constructor(locale="en") (tid=0xD0B6CFF7)
wallet constructor RETURNS (wallet=0x894D7C8,tid=0xD0B6CFF7)
CALL getVersion() (wallet=0x894D7C8,tid=0xD0B6CFF7)
getVersion RETURNS "15.10.00.02" (wallet=0x894D7C8,tid=0xD0B6CFF7)
CALL wallet destructor() (wallet=0x894D7C8,tid=0xD0B6CFF7)
wallet destructor RETURNS (tid=0xD0B6CFF7)
 
Can someone please help me figure out what is happening or missing? Other posts about tdwallet imply it's easy to set up and use. Maybe I'm missing something simple here; maybe a config issue.
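One possibility worth ruling out (a sketch of a shell quoting pitfall, not a confirmed diagnosis): the '.LOGON mydatabase/(u),' output looks exactly like the shell expanding an undefined $tdwallet variable to an empty string before bteq ever sees the token. Inside double quotes or an unquoted heredoc, $tdwallet is shell-expanded; single quotes or a quoted heredoc delimiter pass it through literally:

```shell
# $tdwallet is (presumably) not a shell variable, so inside double quotes
# the shell replaces it with nothing, leaving just "(u)" -- matching the
# symptom in the post.
unset tdwallet
double=".logon mydatabase/$tdwallet(u)"
single='.logon mydatabase/$tdwallet(u)'

echo "$double"   # the $tdwallet token is gone
echo "$single"   # the $tdwallet token survives

# The same applies to heredocs feeding bteq: quote the delimiter so the
# token reaches bteq unexpanded.
cat <<'EOF'
.logon mydatabase/$tdwallet(u),$tdwallet(p)
EOF
```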
 
 


FEXP_OPERATOR: TPT12108: Output Schema does not match data from SELECT statement


Any help will be much appreciated.  Thank you.
 
Teradata Parallel Transporter Version 14.00.00.04
Job log: /apps/tpt/Datos/TDA_DESA/Tptlogs/root-924.out
Job id is root-924, running on TDExpress1403_Sles10
Teradata Parallel Transporter Update Operator Version 14.00.00.04
MLOAD_OPERATOR_150: private log not specified
Teradata Parallel Transporter Export Operator Version 14.00.00.04
FEXP_OPERATOR: private log not specified
FEXP_OPERATOR: connecting sessions
MLOAD_OPERATOR_150: connecting sessions
FEXP_OPERATOR: TPT12108: Output Schema does not match data from SELECT statement
FEXP_OPERATOR: disconnecting sessions
FEXP_OPERATOR: Total processor time used = '0.08 Second(s)'
FEXP_OPERATOR: Start : Wed May 18 11:45:07 2016
FEXP_OPERATOR: End   : Wed May 18 11:45:09 2016
MLOAD_OPERATOR_150: preparing target table(s)
MLOAD_OPERATOR_150: entering Acquisition Phase
MLOAD_OPERATOR_150: disconnecting sessions
MLOAD_OPERATOR_150: Total processor time used = '0.19 Second(s)'
MLOAD_OPERATOR_150: Start : Wed May 18 11:45:07 2016
MLOAD_OPERATOR_150: End   : Wed May 18 11:45:11 2016
Job step Apply_MLoad terminated (status 12)
Job root terminated (status 12)
 


MLOAD - UTY4014 Access module error '43' received during 'set position' operation


Hello,

The script was working fine until I received this error:

**** 12:01:11 UTY0817 MultiLoad submitting the following request:
     BEGIN TRANSACTION;
**** 12:01:11 UTY0817 MultiLoad submitting the following request:
     CHECKPOINT LOADING INTERVAL 0;
**** 12:01:11 UTY0817 MultiLoad submitting the following request:
     CHECKPOINT LOADING INTERVAL 0;
**** 12:01:12 UTY4014 Access module error '43' received during 'set position'
     operation: 'Restart data signature does not match'
**** 12:01:12 UTY1821 Aquisition Phase statistics
     Elapsed time:  00:00:01
     CPU time:      0 Seconds
     MB/sec:        0
     MB/cpusec:     N/A
    
MultiLoad is being called from a script. This is a simple MultiLoad job, and it has been tested on a different server, using the same permissions and details, without any error at all.
This error occurs in the MultiLoad acquisition phase. I would appreciate any kind of help or a push in the right direction.

Thank you.


Set TimeZoneString value using tdlocaledef or dbscontrol utilities


Hi,
I want to set the TimeZoneString value using the tdlocaledef utility. The current value (obtained via the dbscontrol utility with 'display general') is:
18. System TimeZone String         = Australia Eastern
I tried the following to set it:
# tdlocaledef -input sdf.txt

6766: Missing entries in SDF

 

The content of sdf.txt:

TimeZoneString {"GMT-8"; "-8"; "0"}

 

I even tried other contents:

1)TimeZoneString {"GMT-8"; "-8"; "0"}

 

2)TimeZoneString {""}

 

3)TimeZoneString {"Australia Eastern"; "10"; "0"; "6"; "3"; "10"; "0"; "0"; "-1"; "02:00:00"; "4"; "3"; "15"; "0"; "0"; "03:00:00"; "1987"; "1990"; "10"; "0"; "11"; "0"; "3"; "10"; "0"; "0"; "-1"; "02:00:00";

"4"; "3"; "1"; "0"; "0"; "03:00:00"; "1991"; "1994"; "10"; "0"; "11"; "0"; "3"; "10"; "0"; "0"; "-1"; "02:00:00"; "3"; "3"; "0"; "0"; "-1"; "03:00:00"; "1995"; "2005"; "10"; "0"; "11"; "0"; "3"; "10"; "0"; "

0"; "-1"; "02:00:00"; "4"; "4"; "1"; "0"; "0"; "03:00:00"; "2006"; "2006"; "10"; "0"; "11"; "0"; "3"; "10"; "0"; "0"; "-1"; "02:00:00"; "3"; "3"; "0"; "0"; "-1"; "03:00:00"; "2007"; "2007"; "10"; "0"; "11";

"0"; "4"; "10"; "1"; "0"; "0"; "02:00:00"; "4"; "4"; "1"; "0"; "0"; "03:00:00"; "2008"; "9999"; "10"; "0"; "11"; "0"}

 

 

But I am getting errors. Am I doing anything wrong here? What is the correct way to set the time zone value?

 

thanks

Chandra

 

 

 


Need help in using SQL Assistant


Hi All,
 
Two questions:
1. Our team connects to the database using SQL Assistant, on both dev and prod. Recently one of our team members connected to prod by mistake and deleted data, thinking he was in dev. We would like to avoid such human mistakes; any ideas in this regard are much appreciated.
 
2. What we had in mind: whenever we connect to prod, SQL Assistant should display the screen in some color so that we can clearly identify that it is prod, and whenever we run a DELETE or DROP in prod it should show some kind of warning message (as Windows does) before executing. Does this make sense? Please let me know.
 
Thanks, Prasanth


