Hi Balaji Natarajan,
Not with the native tools, but there are SFTP clients that can be used with CPS.
What platform are we talking about? I assume Windows, in which case you would want to look at PSFTP: PuTTY Download Page
Regards,
HP
Hello,
I wrote a script that should create one job per row in a table. The table has two columns (key and parameter).
The script works only for the first value; after that the job goes into error with the message "Trying to use non-attached object TableValue".
Could you help me find the problem?
Source code:
import java.util.Iterator;
import com.redwood.scheduler.api.model.*;
import com.redwood.scheduler.api.model.enumeration.*;
import com.redwood.scheduler.api.scripting.variables.ScriptSessionFactory;
{
  String vTableName = "TABLE1";
  SchedulerSession session = ScriptSessionFactory.getSession();
  Table importTable = session.getTableByName(vTableName);
  if (importTable == null)
  {
    jcsOut.println("Table " + vTableName + " does not exist");
    return;
  }
  // Iterator for looping through the table
  Iterator iterTV = importTable.getTableValues();
  while (iterTV.hasNext())
  {
    // Get the single table value
    TableValue tv = (TableValue) iterTV.next();
    // Check, if the table value belongs to a specific column
    if (tv.getColumnName().equals("PARAMETER"))
    {
      // Get the key to identify the table row
      String vkey = tv.getKey();
      String strParameter = importTable.getTableValueBySearchKeySearchColumnName(vkey, "PARAMETER").getColumnValue();
      // Get the job definition
      JobDefinition jd = jcsSession.getJobDefinitionByName("TEST_JOBDEFINITION");
      // Prepare a job based on the job definition
      Job job = jd.prepare();
      // Provide the other parameter values to the job
      jcsOut.println("JOBNAME: " + strParameter);
      job.getJobParameterByName("p_JobName").setInValueString(strParameter);
      job.getJobParameterByName("jdpName").setInValueString("PRINT_PDEST");
      job.getJobParameterByName("replaceValue").setInValueString("");
      job.getJobParameterByName("p_ChangeJobDef").setInValueString("Y");
      job.getJobParameterByName("p_TestRun").setInValueString("Y");
      jcsOut.println("");
      jcsSession.persist();
      jcsSession.waitForJob(job);
      jcsSession.refreshObjects(new SchedulerEntity[]{jd});
    }
  }
  jcsOut.println("All requests from Table " + vTableName + " processed.");
}
Best regards
Dana
Hello,
This is only required on the job definitions that actually have an action defined.
Regards Gerben
Hello Dana,
This has to do with the refreshing of the session: it removes all retrieved objects from the session, including the entries that are still in your Iterator.
It is better to set up two SchedulerSessions, one for the query and one for the submit. Alternatively, run the query first, store the items in a list, and then loop over that list.
Regards Gerben
PS: a small improvement:
String strParameter = importTable.getTableValueBySearchKeySearchColumnName(vkey,"PARAMETER").getColumnValue();
can be replaced with:
String strParameter = tv.getColumnValue();
Because you already have the correct TableValue selected here.
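For illustration, the "store the items in a list" option could look roughly like the sketch below. This is a sketch only: the table, column, job definition, and parameter names are simply carried over from your script and may need adjusting.
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import com.redwood.scheduler.api.model.*;
{
  // Sketch only: names are taken from the original script.
  String vTableName = "TABLE1";
  Table importTable = jcsSession.getTableByName(vTableName);
  if (importTable == null)
  {
    jcsOut.println("Table " + vTableName + " does not exist");
    return;
  }
  // First pass: copy the parameter values into a plain Java list, so that later
  // persist/refresh calls cannot detach anything we still need to read.
  List values = new ArrayList();
  Iterator iterTV = importTable.getTableValues();
  while (iterTV.hasNext())
  {
    TableValue tv = (TableValue) iterTV.next();
    if (tv.getColumnName().equals("PARAMETER"))
    {
      values.add(tv.getColumnValue());
    }
  }
  // Second pass: submit one job per collected value.
  for (Iterator it = values.iterator(); it.hasNext();)
  {
    String strParameter = (String) it.next();
    // Re-read the job definition each time instead of calling refreshObjects on it.
    JobDefinition jd = jcsSession.getJobDefinitionByName("TEST_JOBDEFINITION");
    Job job = jd.prepare();
    job.getJobParameterByName("p_JobName").setInValueString(strParameter);
    jcsSession.persist();
    jcsSession.waitForJob(job);
  }
}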
Hello Gerben,
I thought session and jcsSession were already two sessions!?
I created a second session but got the same error message. Am I using the sessions in the wrong way?
import java.util.Iterator;
import com.redwood.scheduler.api.model.*;
import com.redwood.scheduler.api.model.enumeration.*;
import com.redwood.scheduler.api.scripting.variables.ScriptSessionFactory;
{
  String vTableName = "TABLE1";
  SchedulerSession session1 = ScriptSessionFactory.getSession();
  SchedulerSession session2 = ScriptSessionFactory.getSession();
  Table importTable = session1.getTableByName(vTableName);
  if (importTable == null)
  {
    jcsOut.println("Table " + vTableName + " does not exist");
    return;
  }
  // Iterator for looping through the table
  Iterator iterTV = importTable.getTableValues();
  while (iterTV.hasNext())
  {
    // Get the single table value
    TableValue tv = (TableValue) iterTV.next();
    // Check, if the table value belongs to a specific column
    if (tv.getColumnName().equals("PARAMETER"))
    {
      // Get the key to identify the table row
      String vkey = tv.getKey();
      String strParameter = tv.getColumnValue();
      // Get the job definition
      JobDefinition jd = session2.getJobDefinitionByName("TEST_JOBDEFINITION");
      // Prepare a job based on the job definition
      Job job = jd.prepare();
      // Provide the other parameter values to the job
      jcsOut.println("JOBNAME: " + strParameter);
      job.getJobParameterByName("p_JobName").setInValueString(strParameter);
      job.getJobParameterByName("jdpName").setInValueString("PRINT_PDEST");
      job.getJobParameterByName("replaceValue").setInValueString("");
      job.getJobParameterByName("p_ChangeJobDef").setInValueString("Y");
      job.getJobParameterByName("p_TestRun").setInValueString("Y");
      jcsOut.println("");
      session2.persist();
      session2.waitForJob(job);
      session2.refreshObjects(new SchedulerEntity[]{jd});
    }
  }
  jcsOut.println("All requests from Table " + vTableName + " processed.");
}
Best regards
Dana
Hello,
I have a daily job that runs at X:00 in Redwood CPS, and I want to change the time for only that one day's execution, then go back to the normal time afterwards.
Is it possible to change this? Any suggestions?
Regards,
Krishna
Hi Jana,
Could you please provide your contact details? I will call you for some information on SAP CPS.
Best Regards,
Manoj Maddukuri.
Hi Dana,
Yes, you are probably right. It is possible that ScriptSessionFactory.getSession() does not actually provide you with a new session.
Can you try SchedulerSession session = jcsJobContext.createSchedulerSession(); instead? This is the preferred way in a RWScript.
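For illustration, the split could look roughly like the sketch below. This is a sketch only: jcsSession is used purely for the table query, and the job definition and parameter names are just carried over from your script.
import java.util.Iterator;
import com.redwood.scheduler.api.model.*;
{
  // Query side: the default script session reads the table.
  Table importTable = jcsSession.getTableByName("TABLE1");
  // Submit side: a separate session created from the job context.
  SchedulerSession submitSession = jcsJobContext.createSchedulerSession();
  Iterator iterTV = importTable.getTableValues();
  while (iterTV.hasNext())
  {
    TableValue tv = (TableValue) iterTV.next();
    if (tv.getColumnName().equals("PARAMETER"))
    {
      JobDefinition jd = submitSession.getJobDefinitionByName("TEST_JOBDEFINITION");
      Job job = jd.prepare();
      job.getJobParameterByName("p_JobName").setInValueString(tv.getColumnValue());
      // Persist and wait only on the submit session, so the query session
      // (and the iterator it backs) is left untouched.
      submitSession.persist();
      submitSession.waitForJob(job);
    }
  }
}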
Regards Gerben
Hello,
Go to the job and choose Scheduling -> Edit. In the pop-up, select that you only want to change the current occurrence, set the start time to the required time, and you are good to go.
Regards Gerben
Include a check for the file event definition:
"select jc.* from JobChain jc, JobDefinition jd, JobDefinitionWaitEvent jw, FileEventDefinition fed"
+ " where jc.JobDefinition = jd.UniqueId"
+ " and jd.UniqueId = jd.MasterJobDefinition"
+ " and jd.UniqueId = jw.JobDefinition"
+ " and jd.UniqueId not in (select distinct JobChainCall.JobDefinition from JobChainCall)"
+ " and jw.EventDefinition = fed.EventDefinition"
thanks
Nanda
You can start by creating a query filter in your job monitor.
Then use that filter to create a report and run the system_reportrun job definition to get CSV output.
thanks
Nanda
Many thanks, Gerben!
Now it works.
Best regards
Dana
Hi Gerben,
Thanks. Please suggest how we can use it effectively. Currently, we do the file transfer from UNIX to a Windows server (global space) through a UNIX (Bash) script.
However, we have an issue with it: the jobs fail with a 10001 return code error even though the file is transferred successfully. We raised a support ticket, but they are still working on it and there has not been much progress. As a workaround, we are forcefully completing the jobs. We are using Redwood version 9.
Please suggest.
Regards,
Abhishek
Hello All,
When we try to change the time zone or anything else in the user settings, we get the error message below:
JCS-122035: Unable to persist: JCS-102200: Registry entry Locale, full path /user/userid@abc/ui/configuration/Locale already exists
at com.redwood.scheduler.model.SchedulerSessionImpl.writeDirtyListLocal(SchedulerSessionImpl.java:1012)
at com.redwood.scheduler.model.SchedulerSessionImpl.persist(SchedulerSessionImpl.java:938)
at com.redwood.scheduler.ui.model.impl.FormImpl.modelActions(FormImpl.java:855)
at com.redwood.scheduler.ui.model.impl.LifeCycle.run(LifeCycle.java:285)
at com.redwood.scheduler.ui.servlet.Servlet.handleRequest(Servlet.java:165)
at com.redwood.scheduler.ui.servlet.Servlet.doPost(Servlet.java:104)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:644)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:725)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.redwood.scheduler.security.filter.ChannelFilter.doFilter(ChannelFilter.java:98)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.redwood.scheduler.module.impl.ModuleFilter.doFilter(ModuleFilter.java:157)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.redwood.scheduler.security.filter.SecurityFilter.doFilter(SecurityFilter.java:803)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.redwood.scheduler.security.filter.HttpMethodFilter.doFilter(HttpMethodFilter.java:74)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.redwood.scheduler.servlet.RequestCharsetFilter.doFilter(RequestCharsetFilter.java:200)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:203)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at com.redwood.platform.context.RPValve.invoke(RPValve.java:30)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:537)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1085)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:658)
at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:222)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1556)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1513)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.redwood.scheduler.model.exception.UniqueConstraintViolatedException: JCS-102200: Registry entry Locale, full path /user/userid@abc/ui/configuration/Locale already exists
at com.redwood.scheduler.model.ModelSQLExceptionMappingImpl.wrapUniqueConstraintViolatedException(ModelSQLExceptionMappingImpl.java:46)
at com.redwood.scheduler.persistence.helper.PostgresGenericSQLExceptionWrapper.throwWrappedSQLExceptionIfRequired(PostgresGenericSQLExceptionWrapper.java:54)
at com.redwood.scheduler.persistence.helper.RecoverySQLExceptionWrapper.wrapSQLException(RecoverySQLExceptionWrapper.java:53)
at com.redwood.scheduler.persistence.impl.OuterPersistenceUnitOfWorkManager.rethrowException(OuterPersistenceUnitOfWorkManager.java:91)
at com.redwood.scheduler.persistence.impl.OuterPersistenceUnitOfWorkManager.execute(OuterPersistenceUnitOfWorkManager.java:43)
at com.redwood.scheduler.persistence.impl.LowLevelPersistenceImpl.writeDirtyObjectList(LowLevelPersistenceImpl.java:197)
at com.redwood.scheduler.cluster.persistence.ClusteredLowLevelPersistence.writeDirtyObjectList(ClusteredLowLevelPersistence.java:67)
at com.redwood.scheduler.model.SchedulerSessionImpl.writeDirtyListLocal(SchedulerSessionImpl.java:987)
... 44 more
Caused by: org.postgresql.util.PSQLException: ERROR: duplicate key value violates unique constraint "jcs_registry00"
Detail: Key (a_name, f_parent)=(Locale, 550505025) already exists.
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2157)
We are using Redwood version 9 (9.0.14.3).
Regards,
Abhishek
Hi All,
I am trying to generate a report for errored or killed jobs.
1. Please help me modify the query in the attached screenshot to get the list of jobs belonging to only a specific application, say 'FICA'.
2. Also, how can I modify the query in the screenshot to get only the jobs starting from a specific date?
I tried to create a query filter in the job monitor and use that filter to extract the jobs, but it does not list the child jobs of a job chain.
Thanks,
Tinku
Hello,
The return code issue has to do with a recent change in version 9.0.14: all CMD scripts that issue an 'exit n' command will fail with a return code of 10001, because the exit command stops the job and skips some of the post-processing.
The way to do this, as described at the bottom of all CMD scripts, is to issue 'exit /b n' instead.
Regards Gerben
We created a Redwood script using Java modules.
Job parameters:
Name | Type | Direction | Description
SFTP_hostname | String | In | SFTP Hostname:Port
SFTP_username | String | In | SFTP username
SFTP_password | String | In |
SFTP_localfilepath | String | In | Local file Path
SFTP_remotefilepath | String | In | Remote File Path
JobId | String | In |
JobFile | String | In |
import com.redwood.scheduler.api.model.Job;
import com.redwood.scheduler.api.model.JobFile;
import java.io.File;
import java.net.URLEncoder;
import org.apache.commons.vfs2.FileObject;
import org.apache.commons.vfs2.FileSystemException;
import org.apache.commons.vfs2.FileSystemOptions;
import org.apache.commons.vfs2.Selectors;
import org.apache.commons.vfs2.impl.StandardFileSystemManager;
import org.apache.commons.vfs2.provider.sftp.SftpFileSystemConfigBuilder;
import org.apache.commons.net.*;
import com.jcraft.jsch.*;
{
  // Set up the SFTP connection string parameters
  String hostName = SFTP_hostname;
  //String hostName = "10.0.0.162:22";
  String username = SFTP_username;
  String pws = SFTP_password;
  //String pws = "vjknfs8975h4wo";
  String password = URLEncoder.encode(pws);
  //String localFilePath = "C:/file/testfile.txt";
  String remoteFilePath = "/out/testfile.txt";
  String remoteTempFilePath = "in/testTempFile.txt";
  Long j_jobid = jcsJob.getJobId();
  // NOTE: 'jf' was not defined at this point in the original post; presumably the job file
  // is looked up from the JobId/JobFile in-parameters, for example:
  Job j = jcsSession.getJobByJobId(Long.valueOf(JobId));
  JobFile jf = j.getJobFileByName(JobFile);
  String j_filename = jf.getFileName();
  Long j_filesize = jf.getSize();
  String j_name = jf.getName();
  //String localFilePath = j_filename;
  String localFilePath = "/opt/redwood/platform/j2ee/cluster/server1/log/scheduler/lis/560000-569999/562479/System_562479_1.txt";
  jcsOut.println(j_jobid);
  jcsOut.println(j_filename);
  jcsOut.println(j_filesize);
  jcsOut.println(j_name);
  jcsOut.println("localFilePath = " + localFilePath);
  // The call was truncated in the original post; completed here with the declared parameters
  upload(hostName, username, password, localFilePath, remoteFilePath);
}

private String myjobFile(Long jobId, String jobFileName)
{
  String returnValue = "";
  Job j = jcsSession.getJobByJobId(jobId);
  JobFile jf = j.getJobFileByName(jobFileName);
  String fileName = jf.getFileName();
  Long fileSize = jf.getSize();
  // Debug output (these lines were at the end of the original paste; they only compile here,
  // where fileName and fileSize are in scope)
  jcsOut.println("Test script");
  jcsOut.println(fileName);
  jcsOut.println(fileSize);
  return returnValue;
}

public static void upload(String hostName, String username, String password, String localFilePath, String remoteFilePath)
{
  File file = new File(localFilePath);
  if (!file.exists())
    throw new RuntimeException("Error. Local file not found");

  StandardFileSystemManager manager = new StandardFileSystemManager();
  try
  {
    manager.init();
    // Create local file object
    FileObject localFile = manager.resolveFile(file.getAbsolutePath());
    // Create remote file object; use createDefaultOptions() in place of fsOptions for all default options - Ashok.
    FileObject remoteFile = manager.resolveFile(
        createConnectionString(hostName, username, password, remoteFilePath), createDefaultOptions());
    // Copy local file to SFTP server
    remoteFile.copyFrom(localFile, Selectors.SELECT_SELF);
    System.out.println("File upload success");
    System.out.println(remoteFilePath);
  }
  catch (Exception e)
  {
    throw new RuntimeException(e);
  }
  finally
  {
    manager.close();
  }
}

public static String createConnectionString(String hostName, String username, String password, String remoteFilePath)
{
  return "sftp://" + username + ":" + password + "@" + hostName + "/" + remoteFilePath;
}

public static FileSystemOptions createDefaultOptions() throws FileSystemException
{
  // Create SFTP options
  FileSystemOptions opts = new FileSystemOptions();
  // Disable strict SSH host key checking
  SftpFileSystemConfigBuilder.getInstance().setStrictHostKeyChecking(opts, "no");
  /*
   * Using the following line will cause VFS to choose the file system's root as VFS's root.
   * If I wanted to use the user's home as VFS's root, I would set the 2nd method parameter to "true".
   */
  SftpFileSystemConfigBuilder.getInstance().setUserDirIsRoot(opts, false);
  // Timeout is counted in milliseconds
  SftpFileSystemConfigBuilder.getInstance().setTimeout(opts, 10000);
  return opts;
}
Thanks Gerben, yes, we are currently using version 9.0.14.3. I will try it and let you know.
Many thanks for help.
Regards,
Abhishek
That makes sense, thanks for your help!
I need to clear the event before raising it; otherwise two events end up raised, which is not the requirement. Is there any Redwood script or expression to check the status of an event (i.e. whether it is raised or cleared) so that I can clear it accordingly? Please help.
waitEvents.EVT_RAISE_MG.raiser comment is not providing the comment.
Please suggest.