Logical Process Modeling is the representation of a business process, detailing all the activities in the process from gathering the initial data to reaching the desired outcome. Building a logical process model involves the steps described below.
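As a rough overview, such a model can be sketched in code as an ordered sequence of named steps, each transforming the process state. This is a minimal illustration, not a standard notation; the Step and run_process names are assumptions made for the example.

    from dataclasses import dataclass
    from typing import Any, Callable

    @dataclass
    class Step:
        name: str                      # e.g. "Data Collection", "Data Assurance"
        run: Callable[[Any], Any]      # each step transforms the process state

    def run_process(steps, initial_data):
        # Execute the steps of a logical process model in order,
        # threading the data from the initial input to the desired outcome.
        state = initial_data
        for step in steps:
            state = step.run(state)
        return state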
Data Collection:
Data collection is the process of gathering and measuring information on variables of interest in an established, systematic fashion that enables one to answer stated research questions, test hypotheses, and evaluate outcomes. The data collection component of research is common to all fields of study, including the physical and social sciences, the humanities, and business. While methods vary by discipline, the emphasis on ensuring accurate and honest collection remains the same.
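As a minimal sketch of systematic collection (the CSV layout and variable names are assumptions for illustration), each observation can be timestamped as it is recorded so that accuracy and completeness can be audited later:

    import csv
    from datetime import datetime, timezone

    def collect(observations, path="observations.csv"):
        # Record validated observations with a timestamp, skipping
        # incomplete values so the data set stays accurate and honest.
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["collected_at", "variable", "value"])
            for variable, value in observations:
                if value is None:
                    continue   # simple completeness check
                writer.writerow([datetime.now(timezone.utc).isoformat(),
                                 variable, value])

    collect([("temperature", 21.5), ("humidity", None), ("humidity", 0.43)])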
Controlling access to the data:
Access control is a security technique that regulates who or what can view or use resources in a computing environment. It is a fundamental concept in security that minimizes risk to the business or organization. To secure a facility, organizations use electronic access control systems that rely on user credentials, access card readers, auditing, and reports to track employee access to restricted business locations and proprietary areas, such as data centers. Some of these systems incorporate access control panels to restrict entry to rooms and buildings as well as alarms and lockdown capabilities to prevent unauthorized access or operations.
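In software, one common realization of this idea is role-based access control. The roles and permissions below are illustrative assumptions, not drawn from any particular product:

    # Map each role to the set of actions it may perform on a resource.
    PERMISSIONS = {
        "analyst": {"read"},
        "engineer": {"read", "write"},
        "admin": {"read", "write", "audit"},
    }

    def can_access(role, action):
        # Deny by default: unknown roles get an empty permission set.
        return action in PERMISSIONS.get(role, set())

    assert can_access("engineer", "write")
    assert not can_access("analyst", "write")   # denied; a real system would also log this for auditing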
Task Analysis:
This step determines which work task in the process should be accomplished next. Task analysis is a structured framework that dissects a job and arrives at a reliable method of describing it across time and people by composing a detailed listing of all the tasks; the first product of a task analysis is a task statement for each task on the list. A task or needs analysis should be performed whenever there are new processes or equipment, when job performance is below standard, or when requests for changes to current training or for new training are received. Such an analysis helps ensure that training is the appropriate solution, rather than some other performance intervention.
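To make the "which task next" question concrete, here is a small sketch in which each task statement carries its prerequisites, and the next task is one whose prerequisites are all complete. The task names and dependency structure are assumed for illustration:

    # Hypothetical task list: each task maps to the tasks it depends on.
    tasks = {
        "gather requirements": set(),
        "draft task statements": {"gather requirements"},
        "review with stakeholders": {"draft task statements"},
    }

    def next_task(tasks, done):
        for name, deps in tasks.items():
            if name not in done and deps <= done:   # all prerequisites met
                return name
        return None   # nothing is ready: either finished or blocked

    print(next_task(tasks, done={"gather requirements"}))   # -> draft task statements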
Delivering data:
This step delivers the appropriate subset of the data to the corresponding work task, so that each task receives only the data it needs to proceed. Data delivery reports provide data processing metrics for a given job or topology; for example, you can use reports to view the number of records that were processed by a job or topology on the previous day. Data processing metrics are also available when you monitor a job or topology: the detail pane of a topology provides a single view into the record count and throughput for all running pipelines in the topology. While those monitoring metrics are displayed on demand, data delivery reports capture the same metrics for specified windows of time that you can easily reference at a later date.
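Both halves of this step, delivering each task only its own subset of the data and counting what was delivered for a later report, can be sketched as follows (the record fields and task names are assumptions):

    from collections import Counter

    # Hypothetical records tagged with the work task they belong to.
    records = [
        {"task": "billing", "amount": 120},
        {"task": "billing", "amount": 75},
        {"task": "shipping", "amount": 30},
    ]

    def deliver(records, task):
        # Deliver only the subset of data the given work task needs.
        return [r for r in records if r["task"] == task]

    # A simple data delivery report: record counts per task, which could
    # be stored for a specified window of time and referenced later.
    report = Counter(r["task"] for r in records)
    print(deliver(records, "billing"))   # the two billing records
    print(report)                        # Counter({'billing': 2, 'shipping': 1})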
Data Assurance:
Data Assurance (DA) checks for and corrects errors that might occur as data is communicated between a host and a storage array. In a logical process model, this step assures that all necessary data exists and that all required actions have been performed at each task. DA capabilities are presented at the pool and volume group level in System Manager, and the feature increases data integrity across the entire storage system by enabling the storage array to check for errors that might occur when data is moved between the hosts and the drives.
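The error-checking idea can be sketched with an ordinary checksum. Real DA implementations work at the storage layer rather than in application code, so this is only an analogy:

    import zlib

    def send(payload):
        # Attach a CRC32 checksum before the data leaves the host.
        return payload, zlib.crc32(payload)

    def receive(payload, checksum):
        # Verify the checksum on arrival; a mismatch means the data was
        # corrupted in transit and should be retransmitted.
        if zlib.crc32(payload) != checksum:
            raise ValueError("data assurance check failed: checksum mismatch")
        return payload

    data, crc = send(b"customer record 42")
    assert receive(data, crc) == b"customer record 42"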
Acceptance of results:
This step provides a mechanism to indicate acceptance of the results of the process. Acceptance must be absolute and unqualified, covering all the terms of the offer; if there is any variation, even on an unimportant point, between the terms of the offer and those of the acceptance, there is no contract. In this context, the mechanism is simply whatever device or procedure transforms the process's outputs into a recorded, communicated sign-off.
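In software terms, the acceptance mechanism can be as simple as a recorded, all-or-nothing sign-off on the process results. The record fields below are assumptions for illustration:

    from datetime import datetime, timezone

    def accept_results(results, approver, accepted, note=""):
        # Record an absolute, unqualified acceptance (or a rejection) of
        # the results; any qualifying note is treated as non-acceptance.
        return {
            "results": results,
            "approver": approver,
            "accepted": accepted and not note,
            "note": note,
            "signed_at": datetime.now(timezone.utc).isoformat(),
        }

    record = accept_results({"report": "Q3 totals"}, approver="jdoe", accepted=True)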