Channel: Teradata Downloads - Tools

TPT load Varchar(MAX) LOB from SQL Server to Teradata


Hi everyone. I'm trying to load SQL Server tables into Teradata 14 over OLE DB using Teradata's OleLoad tool. I'm having trouble with columns defined as VARCHAR(MAX) in SQL Server - it seems this is a LOB data type. Here is the script that OleLoad generates:

USING CHAR SET UTF8
DEFINE JOB MyJob
(
  DEFINE SCHEMA MySchema
  (
    ProjectID        VARCHAR(36),
    WBSElement       VARCHAR(72),
    Network          INTEGER,
    Activity         SMALLINT,
    Item             SMALLINT,
    BOM              VARCHAR(12),
    ItemText1        VARCHAR(120),
    LongText         VARCHAR(63999),
    ReserveNo        INTEGER,
    Material         INTEGER,
    MaterialText     VARCHAR(120),
    "Category"       VARCHAR(75),
    ItmCat           VARCHAR(3),
    Status           VARCHAR(24),
    ResPurcReq       VARCHAR(3),
    PurchReq         INTEGER,
    PO               VARCHAR(3),
    ProfCenter       VARCHAR(21),
    UnloadingPoint   VARCHAR(75),
    OUn              VARCHAR(9),
    Mvt              VARCHAR(3),
    Mvt2             VARCHAR(9),
    PGp              VARCHAR(9),
    A                VARCHAR(3),
    Vendor           VARCHAR(30),
    StorageLoc       VARCHAR(12),
    Rd               VARCHAR(3),
    TimeUnit         VARCHAR(9),
    GLAccount        VARCHAR(18),
    MaterialGroup    VARCHAR(18),
    QtyUnE           FLOAT,
    UnE              VARCHAR(9),
    RequirementsQty  FLOAT,
    BUn              VARCHAR(9),
    RequirementsDate VARCHAR(30),
    QtyReceived      FLOAT,
    QtyWithdrawn     FLOAT,
    ShortfallQty     FLOAT,
    DelTime          VARCHAR(9),
    QtyAvailable     FLOAT,
    PriceLCurrency   DECIMAL(19,4),
    Per              VARCHAR(3),
    LtstReDate       VARCHAR(30),
    OperLTO          VARCHAR(9),
    GRT              VARCHAR(3),
    AutoID           INTEGER
  );
  DEFINE OPERATOR DDLOperator()
  TYPE DDL
  ATTRIBUTES
  (
    VARCHAR PrivateLogName = 'ddl_log',
    VARCHAR TdpId = @MyTdpId,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR WorkingDatabase = @MyDatabase
  );
  DEFINE OPERATOR DataConnOper()
  TYPE DATACONNECTOR PRODUCER
  SCHEMA MySchema
  ATTRIBUTES
  (
    VARCHAR AccessModuleName = 'Oledb_Axsmod',
    VARCHAR AccessModuleInitStr = 'noprompt jobid=1',
    VARCHAR FileName = 'Untitled',
    VARCHAR Format = 'Formatted',
    VARCHAR EnableScan = 'No',
    VARCHAR IndicatorMode = 'Yes',
    VARCHAR PrivateLogName = 'producer_log'
  );
  DEFINE OPERATOR MyConsumer()
  TYPE LOAD
  SCHEMA MySchema
  ATTRIBUTES
  (
    VARCHAR DateForm = 'IntegerDate',
    VARCHAR ErrorTable1 = 'ProjectComponent_errors1',
    VARCHAR ErrorTable2 = 'ProjectComponent_errors2',
    VARCHAR LogTable = '"CONOCO_SAP"."ProjectComponent_Log"',
    VARCHAR PrivateLogName = 'consumer_log',
    VARCHAR TargetTable = '"ProjectComponent"',
    VARCHAR TdpId = @MyTdpId,
    VARCHAR UserName = @MyUserName,
    VARCHAR UserPassword = @MyPassword,
    VARCHAR WorkingDatabase = @MyDatabase
  );
  STEP create_the_table
  (
    APPLY
    ('CREATE MULTISET TABLE "ProjectComponent" ( ProjectID        VARCHAR(12) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 WBSElement       VARCHAR(24) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 Network          INTEGER NOT NULL ,
                                                 Activity         SMALLINT NOT NULL ,
                                                 Item             SMALLINT NOT NULL ,
                                                 BOM              VARCHAR(4) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 ItemText1        VARCHAR(40) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 LongText         VARCHAR(32000) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 ReserveNo        INTEGER ,
                                                 Material         INTEGER ,
                                                 MaterialText     VARCHAR(40) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 "Category"       VARCHAR(25) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 ItmCat           CHAR(1) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 Status           VARCHAR(8) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 ResPurcReq       CHAR(1) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 PurchReq         INTEGER ,
                                                 PO               VARCHAR(1) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 ProfCenter       VARCHAR(7) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 UnloadingPoint   VARCHAR(25) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 OUn              VARCHAR(3) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 Mvt              VARCHAR(1) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 Mvt2             CHAR(3) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 PGp              VARCHAR(3) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 A                VARCHAR(1) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 Vendor           VARCHAR(10) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 StorageLoc       VARCHAR(4) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 Rd               VARCHAR(1) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 TimeUnit         VARCHAR(3) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 GLAccount        VARCHAR(6) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 MaterialGroup    VARCHAR(6) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 QtyUnE           FLOAT ,
                                                 UnE              VARCHAR(3) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 RequirementsQty  FLOAT ,
                                                 BUn              VARCHAR(3) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 RequirementsDate VARCHAR(10) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 QtyReceived      FLOAT ,
                                                 QtyWithdrawn     FLOAT ,
                                                 ShortfallQty     FLOAT ,
                                                 DelTime          VARCHAR(3) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 QtyAvailable     FLOAT ,
                                                 PriceLCurrency   DECIMAL(19,4) NOT NULL ,
                                                 Per              CHAR(1) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 LtstReDate       VARCHAR(10) CHARACTER SET UNICODE CASESPECIFIC  ,
                                                 OperLTO          VARCHAR(3) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 GRT              CHAR(1) CHARACTER SET UNICODE NOT NULL CASESPECIFIC  ,
                                                 AutoID           INTEGER NOT NULL );')
      TO OPERATOR (DDLOperator);
  );
  STEP load_the_data
  (
    APPLY
    ('INSERT INTO "ProjectComponent" ( :ProjectID, :WBSElement, :Network, 
                                       :Activity, :Item, :BOM, :ItemText1, 
                                       :LongText, :ReserveNo, :Material, 
                                       :MaterialText, :"Category", 
                                       :ItmCat, :Status, :ResPurcReq, 
                                       :PurchReq, :PO, :ProfCenter, 
                                       :UnloadingPoint, :OUn, :Mvt, :Mvt2, 
                                       :PGp, :A, :Vendor, :StorageLoc, 
                                       :Rd, :TimeUnit, :GLAccount, 
                                       :MaterialGroup, :QtyUnE, :UnE, 
                                       :RequirementsQty, :BUn, 
                                       :RequirementsDate, :QtyReceived, 
                                       :QtyWithdrawn, :ShortfallQty, 
                                       :DelTime, :QtyAvailable, 
                                       :PriceLCurrency, :Per, :LtstReDate, 
                                       :OperLTO, :GRT, :AutoID );')
      TO OPERATOR (MyConsumer)
      SELECT * FROM OPERATOR (DataConnOper);
  );
);

I have tried the following:
1) Setting LongText to VARCHAR(31000) in both the schema definition and the table definition. This loaded the data, and I could potentially be okay with it, BUT if I set it to anything larger than VARCHAR(31000) I get error 3933 - maximum possible row length in table is too large. I have another table with two VARCHAR(MAX) columns, so this won't work there.
2) Setting the data type of the target column to CLOB. This builds the table and reaches the acquisition phase, but results in error 3798 - column or character expression is larger than the max size.
Any help is appreciated.
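For what it's worth (not covered in the thread above): the Load operator uses the FastLoad protocol, which caps a row at roughly 64 KB and does not support LOB columns, which would be consistent with both the 3933 and the 3798 errors. The route TPT documents for LOBs is the SQL Inserter operator (TYPE INSERTER) together with a deferred LOB column in the schema. A hedged sketch of the relevant fragments - the CLOB size is illustrative, and whether the OLE DB access module can supply deferred LOB data is worth verifying against the TPT documentation for your release:

```sql
DEFINE SCHEMA MySchema
(
  /* ... other columns as above ... */
  LongText CLOB(2000000) AS DEFERRED BY NAME
           /* the field carries the name of a file that holds the LOB value */
);

DEFINE OPERATOR MyInserter()
TYPE INSERTER   /* row-at-a-time SQL protocol: slower than Load, but LOB-capable */
SCHEMA *
ATTRIBUTES
(
  VARCHAR TdpId        = @MyTdpId,
  VARCHAR UserName     = @MyUserName,
  VARCHAR UserPassword = @MyPassword
);
```

The APPLY step would then send rows TO OPERATOR (MyInserter) instead of the Load consumer.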


Unable to use twbcmd to terminate a job if a FileReader is using Vigil to wait on new files


Hello,
I have a process that uses TPT Stream to load from a FileReader producer that uses the Vigil properties to continuously scan a directory. The Vigil time is set to many hours to reduce any load delay caused by TPT startup, since new files are constantly being written.
I am having difficulty finding a way to cleanly end the process while it is running.
I was trying to use twbcmd, but I think I may be encountering a bug. `twbcmd ${JOB_ID} JOB TERMINATE` successfully commits the buffers and performs the necessary checkpoints when files remain that match the FileReader's FileName pattern (with wildcard).
However, when no files in the directory match the FileName pattern, the JOB TERMINATE command is ignored indefinitely. These are the last lines in the tlogview log:
Processing "JOB TERMINATE" user command
Performing checkpoint prior to terminating job

The process waits in this state until new files (one per instance) matching the FileName pattern arrive, or the Vigil time expires. In the former case (for example, if I copy matching files into the directory), the FileReader operators wake up and start processing the files, but then immediately acknowledge the checkpoint signal:
Processing "JOB TERMINATE" user command
Performing checkpoint prior to terminating job

FileReader: TPT19222 Operator instance 1 processing file '[Filename1, removed]'.
FileReader: TPT19222 Operator instance 2 processing file '[Filename2, removed]'.
FileReader: TPT19222 Operator instance 3 processing file '[Filename3, removed]'.
FileReader: TPT19222 Operator instance 4 processing file '[Filename4, removed]'.
Task(SELECT_2[0002]) ready to take internal checkpoint
Task(SELECT_2[0003]) ready to take internal checkpoint
Task(SELECT_2[0004]) ready to take internal checkpoint
Task(SELECT_2[0001]) ready to checkpoint

Task(SELECT_2[0002]): checkpoint completed, status = Success
Task(SELECT_2[0004]): checkpoint completed, status = Success
Task(SELECT_2[0001]): checkpoint completed, status = Success
Task(SELECT_2[0003]): checkpoint completed, status = Success

However, this is always followed by a checkpoint error that triggers a TPT restart.

TPT_INFRA: TPT02258: Error: Operator checkpointing error, status = Retry Error

Task(APPLY_1[0001]): checkpoint completed, status = Retry Error

TPT_INFRA: TPT03720: Error: Checkpoint command failed with 47

TPT_INFRA: TPT02255: Message Buffers Sent/Received = 89752, Total Rows Received = 1822251, Total Rows Sent = 0

TPT_INFRA: TPT02255: Message Buffers Sent/Received = 22422, Total Rows Received = 0, Total Rows Sent = 455158

TPT_INFRA: TPT02255: Message Buffers Sent/Received = 22429, Total Rows Received = 0, Total Rows Sent = 455226

TPT_INFRA: TPT02255: Message Buffers Sent/Received = 22537, Total Rows Received = 0, Total Rows Sent = 458020

TPT_INFRA: TPT02255: Message Buffers Sent/Received = 22368, Total Rows Received = 0, Total Rows Sent = 453847

**** 12:00:16 RDBMS CRASHED OR RETRYABLE CONDITION OCCURRED.

              JOB WILL BE RESTARTED.

 
After the restart, the TPT job continues rather than terminating.
Is there anything I am doing wrong, with respect to either TPT or twbcmd? Is there any other way to cleanly terminate a TPT process that is using a FileReader with Vigil, even when the FileReader has no files to read?
TPT version: 14.00.00.08
OS: Sun OS 5.10 / Solaris 10
Thanks!


Compare system configuration in two different systems


Is there a tool that enables comparing the system configuration of two different systems?


64-bit TPT Tools versus 32-bit TPT Tools


Hello everyone,
 
While installing TPT v14, I noticed that there appear to be 32-bit and 64-bit versions of the utilities. I was wondering if there are any pros or cons to using one version versus the other (we operate on a 64-bit Windows OS).
If it helps, we generally use the Stream, Load, DDL, and DataConnector operators.
Many thanks in advance.


Loading a file into a table using Multiload


Hello!
I'm creating this topic because I have a question about loading a file into a Teradata table. Specifically, my question concerns one particular field value in the file to be inserted: the NULL value.
Let's say I have the following table T1:

CREATE MULTISET TABLE T1
(
field1 VARCHAR(50)
, field2 VARCHAR(50)
);

I want to insert a NULL value into field2. What kind of value do I have to put in the file that I load (using MultiLoad) into this table?
My file F looks like this: two fields, with the "|" character as separator.

line1_field1|line1_field2
line2_field1|line2_field2

I tried leaving it empty, like this:

L1F1|
L2F1|L2F2

But it does not work. When I load the file into the Teradata table using MultiLoad, I get the following error message:

**** 12:03:39 UTY4017 Not enough data in vartext record number 7855.

I hope this is not too noobish a question :D. Sorry if it is >_<.
Best regards.
Gwenael Le Barzic


teradata utility


Hi all,
We are using API and TPump to update from Ab Initio. If there are fewer than 5000 records, it uses API mode to update; if there are more than 5000, it uses TPump.
API mode is working fine, but TPump is not: when there are more than 5000 records, the job fails.
Can anyone please suggest how we can update more than 5000 records using a Teradata utility?
 
Thanks,
Samyuktha


Can the Linux ODBC driver handle array inserts?


I'm using the ODBC driver provided by Teradata to create and populate tables on a remote database server. If my data file has 50K rows, the load shows up on the remote server as 50K separate singleton inserts, which is very inefficient. Is there a way to configure the driver so that it does an array insert instead of a large number of single inserts?
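Some context, not from the driver docs: ODBC array inserts are generally something the application requests by binding parameter arrays (ODBC's SQL_ATTR_PARAMSET_SIZE), rather than a driver configuration switch. If the client code is under your control, batching the rows yourself is the usual shape - a sketch in Python, where the pyodbc usage, table name, and placeholders are illustrative and should be verified against the Teradata ODBC driver:

```python
def batched(rows, size=5000):
    """Group parameter tuples into fixed-size batches for executemany()."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# Hypothetical usage with pyodbc (fast_executemany asks pyodbc to bind
# parameter arrays instead of sending rows one at a time):
#
#   cursor.fast_executemany = True
#   for batch in batched(rows):
#       cursor.executemany("INSERT INTO mytable (a, b) VALUES (?, ?)", batch)
```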
Thanks
 


TPT SCHEMA MAPPER OPERATOR example please


Hello All,
I can't find an example anywhere of the TPT Schema Mapper Operator in Action.  Could someone please post an example.
The $SCHEMAP.txt template file in the TPT Install directory has the following in the comments:
/* ...           the "Schema Mapper Operator" chapter                */
/*               in the Teradata Parallel Transporter User Guide,    */
/*               for a discussion and examples of using its key      */
/*               attributes.                                         */
There is no such chapter. How is it "applied"?
Thanks
 
 


FastLoad


Can anyone please let me know which drivers (JDBC, OLE DB, ODBC) are supported by FastLoad, FastExport, and MultiLoad?


An error occurred during move data process -115


Hi,
Can anybody explain why I am getting this error? When I try to install TD 13.10 SQL Assistant on my system, I get the error below:
 
An error occured during move data process -115
File group: Queryman
File:

c:\Users\.....\AppData\Local\Temp\_ISTMP2.DIR\_ISTMP0.DIR\Teradata_Sql_assitant\autorun.inf

Thanks in advance 


TPT time zone


How do you set the session time zone in a TPT script?
In a FASTLOAD script, the command was:
set time zone 'America Pacific'


TTU programs fail to open


Hello everyone,
 
I've been spending quite a few days trying to solve this problem. Recently all my load/unload tools stopped working. When I try to open FastLoad, it gives me a fastload.exe application error: "The application was unable to start correctly (0xc000007b)". The same happens when I run FastLoad scripts through Python; this used to work before. I did have to install 32-bit Python recently to connect to an Oracle database, but nothing else changed as far as I know (my system is 64-bit Windows 7). I've tried reinstalling the TTU tools, but that doesn't seem to fix the problem.
 
Thanks for your help


BTEQ - want to Generate a record number based on order of rows inserted that is sequential


I am trying to load a very wide file into a Teradata table using BTEQ, and unfortunately the records have no unique identifier other than the record number implied by their position in the file (i.e. there is no explicit record number field). I would like to generate a number on insert that matches the order in which the records are inserted. I tried using CSUM(1,1), but it gives me an error: invalid reference of table data. Identity columns won't help unless I use them to rebuild the record numbers based on order after the load, but that is messy. Is there any way to just generate a sequential record or row number on insert using BTEQ?


TMM - Mapping Save Performance


I am currently using TMM 2.1.6 to auto-find mappings between a source and a target. The application was able to find 200+ mappings automatically within a few seconds, but saving those mappings takes approximately 30+ minutes to complete. Is there any way to speed up the save?
For reference, our TMM repository is housed on a network database, and we are unable to convert it to a local database.


Mload error - 2794 - UPI is an identity column


Greetings Experts,
Recently we have been facing a recurring Informatica job failure ("External Loader Error") on a MultiLoad staging table whose UPI is defined on an identity column named "id" with maximum value 2147483646. The maximum value loaded into this column for the failing load (400000000) is well below that limit. From the UV table we can see that the DBC error code is 2794, indicating duplicates on the PI.
Are the duplicates being generated by the identity column id? If so, why would an identity column generate duplicates? (We have around 35 records in the UV table, and all 35 PI values also exist in the target staging table, thereby causing duplicates on the PI column.)
If not, what might be the issue leading to the failure?
Thanks for your time on this.


Studio Express installer can't find java


I'm installing Studio Express nt-x8664 on Windows 7 (64bit). In a command window, "java -version" returns "java version 1.7.0_25".
At the installer window with the text "Select Java Runtime Environment", I have tried every possible path for the "choose JRE destination folder" prompt, but the installer fails with the message "64bit Java Runtime Environment 1.6 or above is required. Please provide the required JRE path".
The Java download page told me I was installing the 64-bit version.
Help, please.


Teradata Fastload has stopped working


Hi,
I'm testing the FastLoad utility, but I receive a pop-up message at the end of the process:
"Teradata Fastload has stopped working. A problem caused the program to stop working correctly. Windows will close the program and notify you if a solution is available."
The script I used is:

SESSIONS 5; 
TENACITY 5; 
SLEEP 5; 
ERRLIMIT 50;
.logon xxxxx/yyyyy,zzzzzz;
DROP TABLE p_kocac_t.fastload_test;


create table p_kocac_t.fastload_test 
( 
Year_Week_Number char(15),
Relative_Week char(15),
Hierarchy_Code char(15),
Daily_Target char(15)) 
unique primary index(Year_Week_Number ,Relative_Week, Hierarchy_Code); 


DROP TABLE p_kocac_t.fastload_test_ET; 
DROP TABLE p_kocac_t.fastload_test_UV; 


.SET RECORD VARTEXT "|"; 

DEFINE 
Year_Week_Number (VARCHAR(15)), 
Relative_Week (VARCHAR(15)), 
Hierarchy_Code (VARCHAR(15)), 
Daily_Target (VARCHAR(15)) 

FILE=data.csv;

SHOW;
BEGIN LOADING p_kocac_t.fastload_test ERRORFILES p_kocac_t.fastload_test_ET , p_kocac_t.fastload_test_UV
CHECKPOINT 1000;
INSERT INTO p_kocac_t.fastload_test VALUES(
:Year_Week_Number,
:Relative_Week,
:Hierarchy_Code,
:Daily_Target); 

END LOADING; 
.LOGOFF; 
.QUIT;

In file "data.csv" there is only one record, for testing purposes:

Year_Week_Number|Relative_Week|Hierarchy_Code|Daily_Target

 And here is the log:

     ===================================================================
     =                                                                 =
     =          FASTLOAD UTILITY     VERSION 13.10.00.003              =
     =          PLATFORM WIN32                                         =
     =                                                                 =
     ===================================================================

     ===================================================================
     =                                                                 =
     =          Copyright 1984-2010, Teradata Corporation.             =
     =          ALL RIGHTS RESERVED.                                   =
     =                                                                 =
     ===================================================================

**** 09:19:19 Processing starting at: Wed Aug 21 09:19:19 2013

0001 SESSIONS 5;

**** 09:19:19 FDL4866 SESSIONS command accepted

0002 TENACITY 5;

**** 09:19:19 Tenacity Enabled:  5 hour(s)

0003 SLEEP 5;

**** 09:19:19 Sleep Minutes Set: 5 minute(s)

0004 ERRLIMIT 50;

**** 09:19:19 Error limit set to: 50

     ===================================================================
     =                                                                 =
     =          Logon/Connection                                       =
     =                                                                 =
     ===================================================================

0005 .logon xxxxx/yyyyy,

**** 09:19:25 Teradata Database Release: 14.00.03.503
**** 09:19:25 Teradata Database Version: 14.00.03.502
**** 09:19:25 Current CLI or RDBMS allows maximum row size: 64K
**** 09:19:25 Character set for this job: ASCII

0006 DROP TABLE p_kocac_t.fastload_test;

**** 09:19:38 Command completed successfully


0007 create table p_kocac_t.fastload_test
     (
     Year_Week_Number char(15),
     Relative_Week char(15),
     Hierarchy_Code char(15),
     Daily_Target char(15))
     unique primary index(Year_Week_Number ,Relative_Week, Hierarchy_Code);

**** 09:19:44 Command completed successfully


0008 DROP TABLE p_kocac_t.fastload_test_ET;

**** 09:19:50 Command completed successfully

0009 DROP TABLE p_kocac_t.fastload_test_UV;

**** 09:20:03 Command completed successfully


0010 .SET RECORD VARTEXT "|";

**** 09:20:03 Now set to read 'Variable-Length Text' records
**** 09:20:03 Delimiter character(s) is set to '|'
**** 09:20:03 Command completed successfully


0011 DEFINE
     Year_Week_Number (VARCHAR(15)),
     Relative_Week (VARCHAR(15)),
     Hierarchy_Code (VARCHAR(15)),
     Daily_Target (VARCHAR(15))
     
     FILE=data.csv;

**** 09:20:03 FDL4803 DEFINE statement processed


And then an error message box pops up: "Teradata Fastload has stopped working...".
If I then try to select from table p_kocac_t.fastload_test, I get the response "operation not allowed, p_kocac_t.fastload_test is being loaded".

Has anyone run into a similar issue? Any advice?
Regards,
Eminent
 
 


TTU Client install on Linux as non-root user and relocatable


Hi, I need to install the TTU software on Linux in a non-standard location, which I can pass to the RPM package using --prefix. I will need multiple versions of TTU for testing, and I don't want to install the software into the root file system. I also need to be able to install without being the root user, because we don't get that access level on any server.
Is TTU available in a form that will install according to these needs? I need TTU 13 and 14.
Thank you,
Mike


SQL Assistant Change Password

SQL Assistant Database Explorer Obsolete Databases
