Monday, 2 June 2014

Scenarios Set 3

41. In a table, four persons have the same salary. How do you fetch only the 3rd such person's record?
E.g.:
     NAME     SALARY
     A        100
     B        200
     C        300
     D        100
     E        200
     F        100
     G        200
     H        200
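
In SQL this is usually done with ROW_NUMBER() partitioned by SALARY; inside a mapping the same effect comes from sorting by salary and keeping a running occurrence count (for example with a variable port). A minimal sketch of that counting logic in plain Java, using the sample table above (everything outside the sample data is illustrative):

import java.util.HashMap;
import java.util.Map;

public class ThirdPerSalary {
    public static void main(String[] args) {
        String[][] rows = {
            {"A","100"},{"B","200"},{"C","300"},{"D","100"},
            {"E","200"},{"F","100"},{"G","200"},{"H","200"}
        };
        Map<String,Integer> seen = new HashMap<>();    // occurrence counter per salary
        for (String[] r : rows) {
            int n = seen.merge(r[1], 1, Integer::sum); // running count for this salary value
            if (n == 3) {                              // keep only the 3rd occurrence
                System.out.println(r[0] + " " + r[1]); // prints F 100 and G 200
            }
        }
    }
}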

42. "We need to stop the session once, when we encounter the status value as ?I? from source (highlighted below), but at the same time we need to load all records up to that point."                                                
Source:
ID  Name     Status
1   Gowtham  V
2   Raju     V
3   Anil     V
4   Javeed   V
5   Aarman   I
6   Anupam   V
7   Arora    V

Target:
ID  Name     Status
1   Gowtham  V
2   Raju     V
3   Anil     V
4   Javeed   V
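
A common approach is to flag the first 'I' row (for example with a variable port or a Java Transformation) and stop the session only after the rows before it have been passed on. A minimal sketch of that cut-off logic in plain Java, using the sample data above (the session stop is only simulated by breaking out of the loop):

public class StopAtStatusI {
    public static void main(String[] args) {
        String[][] rows = {
            {"1","Gowtham","V"},{"2","Raju","V"},{"3","Anil","V"},
            {"4","Javeed","V"},{"5","Aarman","I"},{"6","Anupam","V"},{"7","Arora","V"}
        };
        for (String[] r : rows) {
            if ("I".equals(r[2])) {
                break;                                               // stop the load here
            }
            System.out.println(r[0] + " " + r[1] + " " + r[2]);     // rows 1-4 reach the target
        }
    }
}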

43.

44. If I have 1 record in the source and I want 100 records in the target, how will you do it?

          JAVA: Type the following code in the On Input Row section of the Java Code tab.

for (int i = 1; i <= 100; i++)
{
    OUT = IN;          // output port name = input port name; copy the input value
    generateRow();     // emit one target row per iteration, 100 rows in total
}

45. I have one source table containing 30 records. Now I have to send the 1st set of records (1-3) to the 1st target, the next set (4-6) to the 2nd target, the next set (7-9) to the 3rd target, and so on (see the sketch after scenario 46).

46. I have one source table containing 30 records. Now I have to send the 1st record to the 1st target, the 2nd record to the 2nd target, the 3rd record to the 3rd target, then the 4th record to the 1st target, the 5th record to the 2nd target, the 6th record to the 3rd target, and so on.
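
Scenarios 45 and 46 usually boil down to generating a row number (for example with a Sequence Generator or an expression variable) and routing on a value derived from it. A minimal sketch of the two routing rules in plain Java; the three "targets" are only printed labels here:

public class RouteToThreeTargets {
    public static void main(String[] args) {
        for (int rownum = 1; rownum <= 12; rownum++) {
            // Scenario 45: blocks of three -> targets cycle 1,1,1,2,2,2,3,3,3,1,...
            int blockTarget = ((rownum - 1) / 3) % 3 + 1;
            // Scenario 46: round robin -> targets cycle 1,2,3,1,2,3,...
            int roundRobinTarget = (rownum - 1) % 3 + 1;
            System.out.println("row " + rownum
                    + " -> block target " + blockTarget
                    + ", round-robin target " + roundRobinTarget);
        }
    }
}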

47. How do you load only the nth to mth range of records to the target?
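
With a generated row number this becomes a simple band filter (row number between n and m). A minimal sketch in plain Java, with n and m as illustrative values:

public class LoadRangeNtoM {
    public static void main(String[] args) {
        int n = 5, m = 9;                                   // illustrative bounds
        for (int rownum = 1; rownum <= 15; rownum++) {
            if (rownum >= n && rownum <= m) {
                System.out.println("load row " + rownum);   // only rows n..m reach the target
            }
        }
    }
}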

48. Scenario:

SOURCE           TARGET
COL1             COL2
A                E
B                F
C
D
*
E
F
*
...

Only the data between the two stars needs to be loaded into the target.
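
One common solution is a flag (for example a variable port) that toggles every time a '*' row is read, and a filter that passes rows only while the flag is set. A minimal sketch of that toggle logic in plain Java, using the sample column above:

public class BetweenStars {
    public static void main(String[] args) {
        String[] col1 = {"A","B","C","D","*","E","F","*"};
        boolean inside = false;                 // becomes true after the opening '*'
        for (String v : col1) {
            if ("*".equals(v)) {
                inside = !inside;               // toggle on every star row
            } else if (inside) {
                System.out.println(v);          // prints E and F
            }
        }
    }
}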

49. Scenario: Daily I get a file with the same structure but a different file name. I want to load these files into a single target table, and this needs to be done without using any script. The file name is not fixed; it changes every day, for example:
file_0420141
file_0420142
file_0420143

50. I have a Department table as below, and I want to convert rows into columns and columns into rows, both using a Normalizer and without a Normalizer:

Source:
DEPTNO,DNAME,LOC
10,ACCOUNTING,NEW YORK
20,RESEARCH,DALLAS
30,SALES,CHICAGO
40,OPERATIONS,BOSTON


Target:
Field1,Field2,Field3,Field4
10,20,30,40
ACCOUNTING,RESEARCH,SALES,OPERATIONS
NEW YORK,DALLAS,CHICAGO,BOSTON
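
Inside a mapping this is typically done with an expression/aggregator (or a Normalizer for the reverse direction); outside the tool the operation is a plain matrix transpose. A minimal sketch in plain Java using the DEPT data above:

public class TransposeDept {
    public static void main(String[] args) {
        String[][] source = {
            {"10","ACCOUNTING","NEW YORK"},
            {"20","RESEARCH","DALLAS"},
            {"30","SALES","CHICAGO"},
            {"40","OPERATIONS","BOSTON"}
        };
        // Swap rows and columns: 4 rows x 3 columns become 3 rows x 4 columns.
        for (int col = 0; col < 3; col++) {
            StringBuilder line = new StringBuilder();
            for (int row = 0; row < 4; row++) {
                if (row > 0) line.append(",");
                line.append(source[row][col]);
            }
            System.out.println(line);   // 10,20,30,40 / ACCOUNTING,... / NEW YORK,...
        }
    }
}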


51. I have a table with columns eno, ename, country, where the country column indicates the location.
eno,ename,country
1,aaa,india
2,bbb,america
3,ccc,india
4,ddd,london
5,eee,london
6,fff,india

If the file arrives in the morning (12:00 am - 11:59 am), only the India records should be loaded; if it arrives in the afternoon (12:00 pm - 05:59 pm), only the America records; and if it arrives in the evening (06:00 pm - 11:59 pm), only the London records.
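
One way is to derive the expected country from the load time and filter on it. A minimal sketch of the time-window logic in plain Java using java.time; the window boundaries come from the scenario, everything else is illustrative:

import java.time.LocalTime;

public class CountryByLoadTime {
    static String countryFor(LocalTime t) {
        if (t.isBefore(LocalTime.NOON)) {
            return "india";          // 12:00 am - 11:59 am
        } else if (t.isBefore(LocalTime.of(18, 0))) {
            return "america";        // 12:00 pm - 05:59 pm
        } else {
            return "london";         // 06:00 pm - 11:59 pm
        }
    }

    public static void main(String[] args) {
        String wanted = countryFor(LocalTime.now());   // country to load for this run
        String[][] rows = {
            {"1","aaa","india"},{"2","bbb","america"},{"3","ccc","india"},
            {"4","ddd","london"},{"5","eee","london"},{"6","fff","india"}
        };
        for (String[] r : rows) {
            if (wanted.equals(r[2])) {
                System.out.println(r[0] + "," + r[1] + "," + r[2]);
            }
        }
    }
}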


52. I have a source table Customer and a lookup table Customer Account.

Source: Customer
Customer_Name|PAN|Account_No
A|AUP|1
B| |4
C|NET|7
D| |9


Lookup: Customer Account:
PAN|Account_No
AUP|1
AUP|2
AUP|3
BAC|4
BAC|5
BAC|6
NET|7
NET|8
DAC|9

Output:
Customer_Name|PAN|Account_No
A|AUP|1
A|AUP|2
A|AUP|3
B|BAC|4
B|BAC|5
B|BAC|6
C|NET|7
C|NET|8
D|DAC|9
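
The lookup has to return every matching account for a PAN, and when the source PAN is blank it first has to be resolved from the account number (for example a lookup configured to return all matches, or a join). A minimal sketch of that resolve-then-expand logic in plain Java, using the sample data above; the data structures are purely illustrative:

import java.util.*;

public class CustomerAccountLookup {
    public static void main(String[] args) {
        // Lookup data: PAN -> all account numbers, plus the reverse Account_No -> PAN.
        String[][] lookup = {
            {"AUP","1"},{"AUP","2"},{"AUP","3"},{"BAC","4"},{"BAC","5"},
            {"BAC","6"},{"NET","7"},{"NET","8"},{"DAC","9"}
        };
        Map<String, List<String>> accountsByPan = new LinkedHashMap<>();
        Map<String, String> panByAccount = new HashMap<>();
        for (String[] l : lookup) {
            accountsByPan.computeIfAbsent(l[0], k -> new ArrayList<>()).add(l[1]);
            panByAccount.put(l[1], l[0]);
        }

        String[][] source = {{"A","AUP","1"},{"B","","4"},{"C","NET","7"},{"D","","9"}};
        for (String[] s : source) {
            String pan = s[1].isEmpty() ? panByAccount.get(s[2]) : s[1];  // resolve blank PAN
            for (String acct : accountsByPan.get(pan)) {                  // expand to all accounts
                System.out.println(s[0] + "|" + pan + "|" + acct);        // matches the output above
            }
        }
    }
}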

53. I need to extract a part of the flat-file name; this part acts as a file validator that indicates the source system. The indicator is defined as 4 digits followed by a single letter. Sample file names:
abc_2016A_file.txt
xyz_2010Zaaa.txt
500P_testFile.txt
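
A pattern of 4 digits followed by a single letter maps directly to the regular expression \d{4}[A-Za-z] (in an expression this could be done with REG_EXTRACT; in a Java Transformation, with java.util.regex). A minimal sketch in plain Java over the sample names above:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ExtractSourceSystem {
    public static void main(String[] args) {
        Pattern p = Pattern.compile("\\d{4}[A-Za-z]");   // 4 digits followed by one letter
        String[] files = {"abc_2016A_file.txt", "xyz_2010Zaaa.txt", "500P_testFile.txt"};
        for (String f : files) {
            Matcher m = p.matcher(f);
            System.out.println(f + " -> " + (m.find() ? m.group() : "no indicator"));
        }
    }
}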

54. Weekly I get 5 files, one file per day, so 5 files in total for the week. Sometimes, if Monday's file is not received, I get 2 files on Tuesday; failing that, 3 files on Wednesday, 4 on Thursday, or all 5 on Friday. I want a generic job that can run that number of instances based on the files available.

55. Design a generic job to load a table named abc_tbl in 5 schemas. Based on the file name, the user defines which schema the data loads into and at what time (the job should run at the user-defined date and time).

56. 

57.

58.

59.

60.
