Automating Loads
I believe this is a new function within BW 3.0. The DataSource now has an option "DataSource transfers double data records" on the "Processing" tab, and with it a flag to "Ignore double data records". I am not sure of the workaround for earlier versions of BW.

Two topics: automating your InfoCube deletes, and using an ODS to track deltas.

TOPIC 1
A) Go into InfoPackage maintenance.
B) Click the Data Targets tab.
C) For your selected data target, click the icon that looks like the Greek sigma, under the column labeled "Automatic loading of similar/identical requests in the InfoCube". (You can also type DELE in the command field.) A popup window entitled "Deleting Request from InfoCube after Update" will appear.
D) On the popup window, you have the option of selecting "Always delete existing requests if delete conditions are found" and "Only for the same selection conditions". If you select these options, BW will automatically delete (or reverse, if you have aggregates) any prior request for the same selection criteria that has not yet been compressed. You can configure your InfoPackage in this manner, then compress the request once you have moved to a new fiscal period and know that you will not load any more data for the prior period.

TOPIC 2
You would then load the current period's full set of data into your ODS every day, and the built-in functionality of the ODS object would detect the differences and send only these on to your target cube. You can read more about this scenario in the white paper entitled "Data Staging Scenarios" at service.sap.com/bw.

Finally, using an ODS object as the data staging area should also eliminate your issues with the PSA. Instead of having your application read the PSA, have it read from an ODS object. Every ODS object has a unique key, so you will not get duplicate records as you can with a PSA. You can also report on ODS data in BEx queries if you need to, which is yet another advantage of this method over the PSA.

The original question: We use Cube 0COOM_CO1, which has a time characteristic of fiscal period. We manually load the current period into this cube daily, but to ensure that we do not have duplicate data in the cube, we manually delete the previous day's request before loading the period again. There has to be an easier way to do this; any suggestions on how to automate the process? The same applies to the PSA. We have an application reading the PSA, and to avoid duplicate records we delete the PSA load before loading it again the next day. Unfortunately, the only way I can find to delete a specific request from the PSA is to go into the request, mark its status as NOT OK, and then delete all requests with errors.

Another reply: You can automate the data deletion process from the InfoPackage. Under the "Data Targets" tab of the InfoPackage there is a checkbox to "delete the entire content of the data target". Setting this checkbox ensures that the data is deleted before the new load. As for the PSA, there is also a feature by which data can be deleted from the PSA.
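To make the TOPIC 1 behavior concrete, here is a minimal conceptual sketch (plain Python, not SAP code) of the deletion rule the InfoPackage settings enable: before a new load, a prior request is removed only when it carries the same selection conditions and has not yet been compressed. All names and the request structure are illustrative assumptions, not actual BW internals.

```python
# Conceptual sketch of BW's "delete similar/identical request" rule.
# A request dictionary here is an illustrative stand-in for a cube request.

def delete_overlapping_requests(requests, new_selection):
    """Return the requests that survive before loading `new_selection`.

    A prior request is dropped only if it has the same selection
    conditions AND has not yet been compressed -- mirroring the
    "Only for the same selection conditions" option in the popup.
    """
    kept = []
    for req in requests:
        same_selection = req["selection"] == new_selection
        if same_selection and not req["compressed"]:
            continue  # BW would delete (or reverse) this request
        kept.append(req)
    return kept

# Daily load of fiscal period 001.2024: yesterday's uncompressed request
# for the same period is removed; the compressed prior-period one stays.
cube = [
    {"id": "REQ1", "selection": {"fiscper": "012.2023"}, "compressed": True},
    {"id": "REQ2", "selection": {"fiscper": "001.2024"}, "compressed": False},
]
cube = delete_overlapping_requests(cube, {"fiscper": "001.2024"})
print([r["id"] for r in cube])  # ['REQ1']
```

This also shows why the advice says to compress a request only once the period is closed: compression is what shields it from the automatic deletion.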
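The TOPIC 2 delta mechanism can be sketched the same way. This is a simplified Python illustration (not SAP code) of the idea: the ODS keeps an active table keyed by its unique key, and when a repeated full load is activated, only new or changed records come out as the delta for the target cube. Field names are assumptions, and a real ODS change log also records before-images, which this sketch omits.

```python
# Conceptual sketch of ODS delta determination during activation:
# key-based comparison of a full load against the active data.

def ods_activate(active_table, full_load, key=("doc_number",)):
    """Merge `full_load` into `active_table`; return the delta records."""
    delta = []
    for record in full_load:
        k = tuple(record[f] for f in key)
        if active_table.get(k) != record:      # new or changed record
            active_table[k] = record
            delta.append(record)               # would go to the change log
    return delta

active = {}                                    # ODS active data, empty at first
day1 = [{"doc_number": "100", "amount": 50},
        {"doc_number": "101", "amount": 75}]
day2 = [{"doc_number": "100", "amount": 50},   # unchanged -> filtered out
        {"doc_number": "101", "amount": 80},   # changed   -> in delta
        {"doc_number": "102", "amount": 20}]   # new       -> in delta

print(ods_activate(active, day1))  # both records are new, both in delta
print(ods_activate(active, day2))  # only 101 (changed) and 102 (new)
```

The unique key is also why reading from an ODS instead of the PSA avoids the duplicate-record problem: a second load of the same document overwrites the existing record rather than adding another copy.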
(c) www.gotothings.com All material on this site is Copyright.