1. Differences b/w 3.0 and 3.5
2. Differences b/w 3.5 and BI 7.0
3. Can you explain a life cycle in brief?
4. Difference b/w table & structure
5. Steps of LO
6. Steps of LIS
7. Steps of generic
8. What is an index and how do you increase performance using indexes?
9. How do you load deltas into ODS and cube?
10. Examples of errors while loading data and how do you resolve them
11. How do you maintain work history until a ticket is closed?
12. What is reconciliation?
13. What is the methodology you use before implementation?
14. What are the roles & responsibilities during an implementation and during support?

Major Differences between SAP BW 3.5 & SAP BI 7.0:
1. In InfoSets you can now include InfoCubes as well.
2. The remodeling transaction helps you add new key figures and characteristics and handles the historical data as well without much hassle. This is only for InfoCubes.
3. The BI Accelerator (for now only for InfoCubes) helps reduce query run time by almost a factor of 10 to 100. The BI Accelerator is a separate box and costs extra; vendors for it are HP or IBM.
4. Monitoring has been improved with a new portal-based cockpit, which means you need an EP (Enterprise Portal) resource on the project to implement the portal.
5. Search functionality has improved: you can search for any object, unlike in 3.5.
6. Transformations are in and routines are passé, though you can always revert to the old transactions.
7. The Data Warehousing Workbench replaces the Administrator Workbench.
8. Functional enhancements have been made for the DataStore object.
9. The transformation replaces the transfer and update rules.
10. New authorization objects have been added.
11. Remodeling of InfoProviders supports you in Information Lifecycle Management.
12. The DataSource:
13. There are functional changes to the Persistent Staging Area (PSA).
14. BI supports real-time data acquisition.
15. SAP BW is now formally known as BI (part of NetWeaver 2004s). It implements Enterprise Data Warehousing (EDW).
16. Loading through the PSA has become mandatory; you cannot skip it, and there is no IDoc transfer method in BI 7.0. The DTP (Data Transfer Process) replaces the transfer and update rules, and in the transformation we can now use a start routine, expert routine and end routine during the data load (see the sketch after this list).

New features in BI 7 compared to earlier versions:
i. New data flow capabilities such as the Data Transfer Process (DTP) and Real-time Data Acquisition (RDA).
ii. Enhanced and graphical transformation capabilities such as drag-and-relate options.
iii. One level of transformation, which replaces the transfer rules and update rules.
iv. Performance optimization, including the new BI Accelerator feature.
v. User management (including a new concept for analysis authorizations) for more flexible BI end-user authorizations.
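As a minimal sketch of point 16: in a BI 7.0 transformation the start routine is a method whose frame and the typing of SOURCE_PACKAGE are generated by the system; only the body is written by hand. The field DOCTYPE and the value 'ZORD' below are hypothetical illustration values.

*----------------------------------------------------------------------*
* Start routine body (BI 7.0 transformation). SOURCE_PACKAGE is the
* generated internal table typed to the source structure of the
* transformation.
* Drop irrelevant records before the field mappings run, so the rest of
* the transformation works on a smaller data package.
*----------------------------------------------------------------------*
DELETE SOURCE_PACKAGE WHERE doctype <> 'ZORD'.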
==============================
1. Project preparation.
Normally in the first phase all the management people sit together for discussions. In the second phase you get a functional spec, and based on that you write a technical spec. In the third phase you actually implement the project, and finally, after testing, you deploy it to production, i.e. go-live. You would typically fall under and get involved in the realization phase. If it is a support project, you come into the picture only after successful deployment.
===========================
We need to transfer the DataSources from Business Content and activate them from the delivered version D to the active version A:
RSA5 -- transfer the desired DataSource.
LBWE -- maintain the extract structure.
LBWG -- delete the setup tables.
OLI*BW -- statistical setup (initialization) to fill the setup tables.
LBWQ -- delete the extractor queue.
SM13 -- delete the update queue.
LBWF -- display the log.
It is always recommended to use queued delta as the update method, since it uses a collective run to schedule all the changed records. Once the initial run is over, schedule the collective run periodically for further delta loads.
Why do we delete the setup tables? We need to delete the data that is already in them, and also because we change the extract structure by adding the required fields from the R/3 communication structure. We can also select and hide fields here; all the fields in blue are mandatory. If the required fields are not available, we go for a DataSource enhancement, as sketched below.
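If the extract structure alone is not enough, the appended fields are usually filled in the customer exit for transaction data (enhancement RSAP0001, component EXIT_SAPLRSAP_001, include ZXRSAU01). A minimal sketch, assuming a hypothetical field ZZROUTE appended to the extract structure MC11VA0ITM of 2LIS_11_VAITM and filled from VBAP-ROUTE; the parameter names follow the standard exit interface:

* Include ZXRSAU01 (customer exit EXIT_SAPLRSAP_001, transaction data).
* ZZROUTE is a hypothetical append field on extract structure MC11VA0ITM.
DATA: l_s_vaitm TYPE mc11va0itm.

CASE i_datasource.
  WHEN '2LIS_11_VAITM'.
    LOOP AT c_t_data INTO l_s_vaitm.
*     Read the appended field from the sales document item table.
      SELECT SINGLE route FROM vbap INTO l_s_vaitm-zzroute
        WHERE vbeln = l_s_vaitm-vbeln
          AND posnr = l_s_vaitm-posnr.
      MODIFY c_t_data FROM l_s_vaitm.
    ENDLOOP.
ENDCASE.
=================================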
The number range for information structures is 000 to 999: 000-499 is for SAP-defined structures and 500-999 for customer-defined ones. We need to consider the two cases separately.
SAP-defined information structure:
Tcode LBW0 -- select the option to set up the environment; it creates two tables and one structure with the naming convention 2LIS_<appl. no.>_BIW1, 2LIS_<appl. no.>_BIW2 and 2LIS_<appl. no.>_BIWS. The two tables are used alternately to enable delta records; which one is currently active can be seen in table TMCBIW. Then go to LBW1 to change the version and LBW2 to set up the 'no update' method. Now do the full load; after that go to LBW1 to change the version again, then to LBW2 to set up the delta update and the periodic job. Now you can load the delta updates.
Customer-defined information structure:
MC18 -- create the information structure. Then in Tcode LBW0 give the IS name and set up the environment for it. You can use OMO1 to fill the tables and do the full upload; then set up the delta by selecting 'setup delta'. You can control whether the delta is enabled using the activate/deactivate delta option.
In both cases, while migrating the data you need to lock the setup tables to prevent users from entering transactions (SE14), and unlock them after completion. LO is preferred to LIS in all respects: LO provides information structures up to the required level of detail, gives better performance, and the setup tables are deleted after the update, which we never do in LIS.
=========================
Steps:
HIDE is used to hide fields; a hidden field's data is not transferred from R/3 to BW.
SELECT makes the field available on the selection screen of the InfoPackage when you schedule it.
INVERSION is for key figures: the value is multiplied by -1 to cancel (nullify) the original value.
Once the DataSource is generated you can extract data with it.
For the delta we have three options: 0CALDAY, numeric pointer and timestamp.
0CALDAY -- to be run only once a day, at the end of the day, with a safety interval of about 5 minutes.
Numeric pointer -- to be used for tables that only allow appending of records and no changes, e.g. CATSDB, the HR time management table.
Whenever there is a 1:1 relation you use a view; for a 1:M relation you use a function module, as sketched below.
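For the 1:M (function module) case, here is a condensed sketch modeled on the delivered template RSAX_BIW_GET_DATA_SIMPLE. The function name Z_BIW_GET_DATA_ZORDERS, the source table ZORDERS and the extract structure ZBW_S_ORDERS are hypothetical; the function must be created with the same interface as the template, and the handling of the selection criteria is omitted for brevity.

FUNCTION z_biw_get_data_zorders.
*"----------------------------------------------------------------------
*" Interface as in template RSAX_BIW_GET_DATA_SIMPLE:
*"  IMPORTING  VALUE(I_REQUNR)   TYPE SRSC_S_IF_SIMPLE-REQUNR
*"             VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
*"             VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
*"             VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*"  TABLES     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
*"             I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
*"             E_T_DATA   STRUCTURE ZBW_S_ORDERS OPTIONAL
*"  EXCEPTIONS NO_MORE_DATA  ERROR_PASSED_TO_MESS_HANDLER
*"----------------------------------------------------------------------
  STATICS: s_cursor  TYPE cursor,
           s_maxsize TYPE srsc_s_if_simple-maxsize,
           s_counter TYPE sy-tabix.

  IF i_initflag = 'X'.
*   Initialization call: remember the package size for the data calls.
*   (Handling of I_T_SELECT / I_T_FIELDS is omitted for brevity.)
    s_maxsize = i_maxsize.
    s_counter = 0.
  ELSE.
    IF s_counter = 0.
*     First data call: open a cursor over the source table(s); a join
*     over the 1:M relationship would go into this SELECT.
      OPEN CURSOR WITH HOLD s_cursor FOR
        SELECT * FROM zorders.
    ENDIF.
*   Return one data package per call until the cursor is exhausted.
    FETCH NEXT CURSOR s_cursor
          APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
          PACKAGE SIZE s_maxsize.
    IF sy-subrc <> 0.
      CLOSE CURSOR s_cursor.
      RAISE no_more_data.
    ENDIF.
    s_counter = s_counter + 1.
  ENDIF.
ENDFUNCTION.
============================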
Take the common example of a book with an index: the index gives the exact location of each topic, so you can go straight to the particular page and continue from there. In the same way, indexes on the database tables on the BW side give the database a direct path to the requested records instead of scanning the whole table, which speeds up reporting. Secondary indexes on an InfoCube slow down large data loads, however, so they are typically dropped before the load and rebuilt afterwards (see the sketch below).
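A sketch of dropping and rebuilding the secondary indexes of an InfoCube from a small report, assuming the commonly referenced function modules RSDU_INFOCUBE_INDEXES_DROP and RSDU_INFOCUBE_INDEXES_REPAIR and a hypothetical cube ZSALES_C01 (in practice this is usually done via the corresponding process chain steps or from cube maintenance):

REPORT z_cube_index_toggle.

* Drop the InfoCube's secondary indexes before a mass load, or rebuild
* them afterwards so that query performance is restored.
PARAMETERS: p_cube TYPE rsinfocube DEFAULT 'ZSALES_C01',
            p_drop AS CHECKBOX DEFAULT 'X'.

IF p_drop = 'X'.
  CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
    EXPORTING
      i_infocube = p_cube.
ELSE.
  CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
    EXPORTING
      i_infocube = p_cube.
ENDIF.
=============================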
2. Data error in PSA.
3. RFC connection failed.
4. Short dump error.
a) Loads can fail due to invalid characters in the data (a cleansing sketch follows below).
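A common fix for the invalid-character error is to cleanse the offending field in a transfer/field routine. A minimal sketch, assuming the character value to be cleaned is in lv_text and that the allowed set below matches what is maintained in transaction RSKC:

* Replace every character that is not in the allowed set by a space, so
* the record no longer fails with "invalid characters".
DATA: lv_text TYPE c LENGTH 60,
      lv_len  TYPE i,
      lv_off  TYPE i.
CONSTANTS: c_allowed TYPE c LENGTH 64
  VALUE ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'.

lv_len = strlen( lv_text ).
DO lv_len TIMES.
  lv_off = sy-index - 1.
  IF lv_text+lv_off(1) NA c_allowed.
    lv_text+lv_off(1) = ' '.   "blank out the offending character
  ENDIF.
ENDDO.
===========================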
There are two types of tickets:
* ITO tickets - usually generated automatically by the system when a process fails; for example, when a process chain fails to run it generates an ITO ticket, which we need to address to find the fault.
* Non-ITO tickets - issues which the client faces and which are forwarded for correction or alternative action.
If you're using Remedy for tickets, unfortunately it's not possible. But this depends on the software you are using; ask your admin.
===========================
In general reconciliation is done at three places: comparing the InfoProvider data with the R/3 data, comparing the query output with the R/3 or ODS data, and checking the key figure values in the InfoProvider against the key figure values in the PSA. A sketch of the last check follows below.
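As an illustration of the last type of check, here is a sketch that compares a key figure total between a DSO (ODS) active table and an InfoCube's fact tables. The object names ZSALES_O1 and ZSALES_C01 and the key figure ZAMOUNT are hypothetical; the table names follow the usual /BIC/A<DSO>00, /BIC/F<cube> and /BIC/E<cube> naming conventions.

REPORT z_reconcile_amount.

* Total a key figure in the DSO active table and in the cube's F and E
* fact tables, then compare; a difference points to missing or
* duplicated requests in one of the targets.
DATA: lv_dso  TYPE p DECIMALS 2,
      lv_f    TYPE p DECIMALS 2,
      lv_e    TYPE p DECIMALS 2,
      lv_cube TYPE p DECIMALS 2.

SELECT SUM( /bic/zamount ) FROM /bic/azsales_o100 INTO lv_dso.
SELECT SUM( /bic/zamount ) FROM /bic/fzsales_c01  INTO lv_f.
SELECT SUM( /bic/zamount ) FROM /bic/ezsales_c01  INTO lv_e.
lv_cube = lv_f + lv_e.

WRITE: / 'DSO total :', lv_dso,
       / 'Cube total:', lv_cube.
IF lv_dso <> lv_cube.
  WRITE: / 'Totals differ - reconcile the requests for this target.'.
ENDIF.
===========================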
The methodology is the ASAP roadmap with five phases:
1. Project Preparation, in which initial planning and preparation for the project take place;
2. Blueprint, in which the business processes are defined and the business blueprint document is designed;
3. Realization, in which the system is configured, knowledge transfer occurs, extensive unit testing is completed, and data mappings and data requirements for migration are defined;
4. Final Preparation, in which final integration testing, stress testing, and conversion testing are conducted, and all end users are trained; and
5. Go-Live and Support, in which the data is migrated from the legacy systems, the new system is activated, and post-implementation support is provided.
===========================
First and foremost is requirements gathering from the client. Based on the requirements you create a business blueprint of the project, which covers the entire process from the start to the end of the implementation. After the blueprint phase is signed off we start the realization phase, where the actual development happens. For example, after installing the necessary software and patches for BI, we discuss with the end users who are going to use the system to collect inputs such as how they want a report to look and what the Key Performance Indicators (KPIs) for the reports are; basically it is a question-and-answer session with the business users. After collecting that information, development happens in the development server. When development is finished, the same objects are tested in the quality server for bugs and errors. When all tests are done we transport the objects to the production environment and test again that everything works. Then the go-live of the project happens: actual postings are made by the users, and reports are generated from those inputs, which are available as analytical reports for management to take decisions.
The responsibilities vary depending on the requirement: initially the business analyst interacts with the end users/managers, then based on the requirements the consultants do the development, the testers do the testing, and finally the go-live happens.

What are the objects that we perform in a production support project?
In production support, most projects work in the monitoring area for their loads (R/3 or non-SAP sources to the data targets in BW). Depending on the project, some use process chains and some use event chains, so it varies from project to project.

What are the different transactions that we use frequently in a production support project?
Generally in a production support project we check the loads using RSMO for monitoring and rectify errors there using step-by-step analysis. The consultant is required to have access to the following transactions in R/3:
1. ST22
Authorizations for the following transactions are required in BW:
1. RSA1
The Process Chain Maintenance (transaction RSPC) is used to define, change and view process chains.
The Upload Monitor (transaction RSMO, or RSRQ if the request is known) monitors data loads.
The Workload Monitor (transaction ST03) shows important overall key performance indicators (KPIs) for system performance.
The OS Monitor (transaction ST06) gives an overview of the current CPU, memory, I/O and network load on an application server instance.
The Database Monitor (transaction ST04) checks important performance indicators in the database, such as database size, database buffer quality and database indexes.
The SQL Trace (transaction ST05) records all activities on the database and enables you to check long runtimes on a DB table or several similar accesses to the same data.
The ABAP Runtime Analysis (transaction SE30) analyzes the runtime of ABAP programs.
The Cache Monitor (accessible with transaction RSRCACHE or from RSRT) shows, among other things, the cache size and the currently cached queries. The export/import shared buffer determines the cache size; it should be at least 40 MB.
*-- Anu radha