Schwachstellen/en (Vulnerabilities)

From the HITGuard User Guide
<b>What is a review?</b>


In HITGuard, a review is understood to be the recording of deviations from a target state. For example, a review can be an audit by an external auditor. The findings that the auditor may have handed over to you in the form of a report can be entered in HITGuard as a so-called "review result".


Review results can also arise from a review with HITGuard. This is done by using knowledge bases in the "gap analyses". Here, a review is guided by structured questionnaires, with the help of which deviations from the desired target state are determined.


The target state is referred to as the target score level in HITGuard and can be set separately for each management system. Only experts or administrators can set and change the target score level under [[Special:MyLanguage/Managementsysteme#Aktiver_Analysezeitraum|"Administration → Management Systems"]].


<span id="Überprüfungen_(Abweichungsanalysen/Prüfergebnisse)"></span>
==<span id="create_überprüfung"></span> Reviews (gap analyses/review results) ==


Under "Risk management → Vulnerabilities → <u>Reviews</u> | Review objects | Gaps | Clarification needed", professionals and experts can find all reviews that have been created in the management system. All reviews are displayed, regardless of whether they are completed, in progress, or in draft status. New reviews can also be created or requested here. Furthermore, the reviews can also be downloaded as PDF or Word files.


[[Datei:Risikoidentifikation Überprüfungen.png|left|thumb|900px|Overview of the reviews]]
<br clear=all>


<span id="Überprüfung_erstellen/bearbeiten"></span>
=== Create/edit review ===


<b>Review result:</b>
* A review result comprises, for example, the findings that were handed over by the auditor in the course of an audit, possibly in the form of a report.
* These findings can be entered using the "Add review result" button, accessible via the dropdown in the "Plus" button.


<b>Gap analysis:</b>
* Gap analyses are questionnaire-based reviews (based on a knowledge base, KB) on specific topics. These questionnaires can be used, for example, to determine the degree of compliance with a standard. In addition to the questionnaire topics, other review results can be recorded.
* If a translation of the KB is available in the currently selected language (flag at the top right, next to the "Logout" button), it will be applied.
* To create a gap analysis, click on the "Plus" button.


In terms of procedure, the only difference between the two review options is that a review result cannot handle review objects based on knowledge bases.
To edit a review, double-click on it in the overview.


For more information on creating or editing a review, whether gap analysis or review result, see <b>[[Special:MyLanguage/Überprüfung| Create/Edit review]]</b>.
 
===Copy review===
====Create duplicate====
A copy of a review can be created at the click of a button. The structure of the review objects is taken from the original, but not the answers or any linked elements (measures, controls, ...). Copies are created in the Draft state with "- Copy" appended to the name.
====Create reassessment====
A reassessment of a review can be created at the click of a button. Not only is the structure of the review objects taken from the original; you can also decide whether to adopt the answers and justifications. Reassessments are created in the Draft state with "- Revaluation" appended to the name. The original must be in the Closed state for this to work.


=== <span id="asses_wiz_nav"></span>Navigation in the wizard ===
The following section explains how the navigation in the review wizard works.


[[Datei:Wizard Navigation.png|left|thumb|900px|Review wizard]]
<br clear=all>


The navigation in the wizard for performing reviews works as follows:
* Clicking on "Next" takes you to the next step or to the next review question.


* Clicking on "Back" takes you to the previous step or to the previous review question.


* Clicking in the navigation tree on the left side will take you to the desired location.


* At the bottom left of the navigation mask, the review questions can be displayed in the navigation tree via a "Review questions" checkbox. The wizard remembers whether the checkbox was selected and keeps that setting.


* If you show the review questions, you can also navigate to them via the tree. In the same way, you are taken back to the review question/result if you have left the review by creating a measure or control. Generally, the left part always shows the review questions/results that are currently visible on the right.


* "Save" and "Close" behave as expected.


Regardless of the type of review you perform (gap analysis using a knowledge base or recording review results), the processing steps in the review wizard are essentially the same:
# Create and save the review
# Add topics or review objects and activate the review
# Answer the review objects, or (for self assessments) request an answer from the interview partner
# Check the responses or identified gaps
# Complete the review


A review can be performed by an expert or professional. They also have the option of requesting the answer to the review from their interview partner in HITGuard. Via workflow support, the interview partner receives a request and can complete the response and then return it to the expert. The expert checks the results and can mark the review as completed and archive it. Gaps from the review can be handled at any time, even if the review has already been completed. Measures and controls can also be linked at any time.


<span id="Status_und_Löschen_einer_Überprüfung"></span>
=== <span id="Status"></span>Status and deletion of a review ===


[[Datei:Überprüfung Stati wechseln.PNG|right|thumb|900px]]

A review can have different states. If the e-mail notifications are active in the management system, all persons relevant in the workflow are prompted to perform their tasks when the status changes. This would be, for example, the interview partner if an auditor requests a response, or the auditor themselves if the response is returned.


The status of the review can be changed via the blue button in the upper right corner.
<br clear=all>


<b>Draft</b>


<b>In progress</b>
* If the review is activated, it is set to the "In progress" status. As a result, the interview partner and auditor will see it under "My tasks".
* Now the lead auditor can perform the review or request a response from the interview partners via "Request response" (self assessments only).
* It can be set back to the status "Draft" by selecting "Deactivate review".
* It can be set to the status "Closed" by selecting "Close review".


<b>Requested (only for self assessments)</b>
* If the review is requested by the lead auditor, it is set to the "Requested" status. The interview partner is prompted via e-mail to perform the review.
* The interview partner can set the status to "Answered" by clicking on "Submit review" after the review has been conducted.
* Requested self assessments are marked with a badge.


<b>Answered (only for self assessments)</b>
* If the review is returned by the interview partner with "Submit review", it is set to the status "Answered". The auditors are prompted by e-mail to check the response.
* Answered self assessments are marked with a badge.
* It can be returned to the status "Requested" by selecting "Request response" again. The interview partner must then revise their response.
* It can be put back into the status "Draft" by selecting "Deactivate review" (only auditors will be notified).
* It can be moved to the status "Closed" by selecting "Close review".


<b>Closed</b>
* If the review is set to the "Closed" status by "Close review", it is read-only and can no longer be edited.
* <u>Caution</u>: A self assessment can only be closed if there is at least one interview partner.
* <u>Exception</u>: Even in already closed reviews, measures and controls can still be added to or removed from review questions.
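The status transitions described above can be summarized as a small state machine. The following sketch is purely illustrative: the state and action names mirror the button labels in this guide, but the code is not part of any HITGuard API.

```python
# Illustrative sketch of the review status workflow described above.
# State and action names follow the button labels in this guide;
# this is NOT HITGuard code or an official API.

# Allowed transitions: status -> {action: next status}
TRANSITIONS = {
    "Draft": {"Activate review": "In progress"},
    "In progress": {
        "Request response": "Requested",   # self assessments only
        "Deactivate review": "Draft",
        "Close review": "Closed",
    },
    "Requested": {"Submit review": "Answered"},
    "Answered": {
        "Request response": "Requested",   # interview partner must revise
        "Deactivate review": "Draft",      # only auditors are notified
        "Close review": "Closed",
    },
    "Closed": {},  # read-only; only measures/controls may still be linked
}

def next_status(current: str, action: str) -> str:
    """Return the resulting status, or raise if the action is not allowed."""
    try:
        return TRANSITIONS[current][action]
    except KeyError:
        raise ValueError(f"'{action}' is not allowed in status '{current}'")

print(next_status("Draft", "Activate review"))   # In progress
print(next_status("Answered", "Close review"))   # Closed
```

Note how "Closed" has no outgoing transitions, matching the read-only behavior described above.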


<b>Delete a review</b>
* With "Delete review" you can delete reviews that are <b>not</b> yet completed.
* Caution: Deleting also removes the review objects created in this review as well as gaps already assigned to risks!


<span id="Überprüfungstyp_wechseln_(Interview_Self_Assessment)"></span>
==== Change review type (interview <=> self assessment) ====


The type of review can be changed only in the "Draft" status. If the type is changed to "Self assessment", the end date changes to the reply deadline.


If the wrong type was set and the review was activated, the review must first be reset to the "Draft" status via "Deactivate review".


====Tips, tricks & best practice====
[[Datei:BESTPRACTICE.png|left|thumb|100px]]
*This type of analysis is a powerful tool in HITGuard. It is a central component of risk identification and treatment. Detected gaps can be linked with reduction measures and/or monitoring controls directly within the analysis.
*The crucial benefit of doing this in one step is that any gaps, measures, and controls can be assigned to the identified risk. If the review object is also linked with a structural element, such as an application, this information is also comprehensibly shown in the details of that element.
*Revaluate instead of evaluating from scratch: from time to time, generally at regular intervals, the status quo should be ascertained again. For this, HITGuard offers the revaluation of analyses, which allows previous analyses to be updated instead of performing a completely new analysis. Previous answers can be viewed and even carried over. This makes the development of a review object even more apparent.<br clear=all>


<span id="Prüfobjekte"></span>
== <span id="Prüfobjekte"></span>Review objects ==


Under "Risk management → Vulnerabilities → Reviews | <u>Review objects</u> | Gaps | Clarification needed", you will find all the review objects that were created in the course of reviews in the current management system.

[[Datei:Prüfobjekte Übersicht.png|left|thumb|900px|Overview of the review objects]]
<br clear=all>


Clicking on a review object opens the detailed view.


[[Datei:Prüfobjekt bearbeiten.png|left|thumb|901px|Edit review object]]
<br clear=all>


Here you can see how the review object was answered. Likewise, if several versions of the review object are available, you can view how the assessment of the review object has developed from one version to the next. Only the header data of a review object can be edited via this mask. This means that this mask cannot be used to answer a review object.


<span id="Teil-Automatische_Neubewertung_initiieren"></span>
=== Initiate semi-automatic revaluation ===


Due to the implementation of measures, it can happen that review objects are proposed for semi-automatic revaluation. This always happens if the measure was either created for a review object in the course of a review or linked to a review object, the "after" value of the vulnerability reduction was set, and the measure has been implemented. When a measure is implemented, the linked review objects are marked with "Revaluation recommended".


To avoid having to perform a new review every time a measure is implemented, HITGuard offers the option of subjecting these marked review objects to a semi-automatic revaluation. This means that HITGuard automatically updates the gap of the respective review questions of the review objects. A separate review is created for each individual organizational unit. In this process, the review questions that are affected by the implementation of measures are set to the "after" value of the vulnerability reduction.


Execution:
# Select the review object.
# Click the orange arrow "Initiate semi-automatic revaluation".
# Select the gaps to be updated.
# Click the orange arrow "Perform revaluation for selected gaps".
 
[[Datei:RNDezember2019 4.png|left|thumb|901px|Semi-automatic revaluation]]
<br clear=all>
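The revaluation rule described above (selected review questions whose linked measure has been implemented are set to that measure's "after" value) can be sketched in a few lines. All class and field names here are illustrative assumptions for explanation, not HITGuard internals.

```python
# Hedged sketch of the semi-automatic revaluation rule described above.
# All names (ReviewQuestion, score, linked_measure, ...) are illustrative
# assumptions; this is NOT HITGuard code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewQuestion:
    text: str
    score: int                      # current evaluated score
    linked_measure: Optional[str]   # name of a linked measure, if any

def revaluate(questions, selected, implemented_measures, after_values):
    """Set each selected question whose linked measure is implemented
    to that measure's 'after' value of the vulnerability reduction."""
    for q in questions:
        if q in selected and q.linked_measure in implemented_measures:
            q.score = after_values[q.linked_measure]

qs = [
    ReviewQuestion("Are backups encrypted?", 1, "Encrypt backups"),
    ReviewQuestion("Are logs reviewed?", 2, None),
]
# The measure "Encrypt backups" is implemented with an "after" value of 4:
revaluate(qs, qs, {"Encrypt backups"}, {"Encrypt backups": 4})
print(qs[0].score)  # 4 (updated to the "after" value)
print(qs[1].score)  # 2 (unchanged, no linked measure)
```

Only questions with an implemented, linked measure change; everything else keeps its last evaluation, which is exactly why a full new review is unnecessary.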
 
===Tips, tricks & best practice===
[[Datei:BESTPRACTICE.png|left|thumb|100px]]
Review objects can be evaluated multiple times within one analysis. For example, BSI's IT-Grundschutz offers a module "Web applications". If various web applications are operated in an organization, it is recommended to answer the module for each one of them. This means the module should be selected multiple times across one or more reviews and linked to the respective resources. Additional tip: give your review objects meaningful, descriptive names, such as "Web application 123_Cloud". This makes it easy to search the review objects when performing a revaluation.<br clear=all>




<span id="Abweichungen"></span>
== <span id="Deviations"></span>Gaps ==


Under "Risk management → Vulnerabilities → Reviews | Review objects | <u>Gaps</u> | Clarification needed", you will find all gaps that were identified during the performance of reviews.


[[Datei:Abweichungen Übersicht.png|left|thumb|900px|Overview of gaps]]
<br clear=all>


The columns "Measure missing", "Target value missing", and "Target value too low" can be used to find out against which gaps nothing, or too little, has been done. These gaps are tagged in the grid. If a gap does not have a tag, this means that attempts are being made to correct the gap.
 
Here you have the option to assign gaps that have not yet been assigned to a risk.
 
Double-clicking on a gap opens the review at the point where the gap was detected. Here, measures and controls for the gap can now be defined. For more information, see [[Special:MyLanguage/Prüffragen_beantworten| Answer review questions]].
 
Optionally, it is possible to display a column that shows whether the line is a review question (from a knowledge base) or a review result (freely entered). This allows experts to expand their self-developed knowledge bases with review results that are often added to reviews during interviews.
 
<span id="Abweichungen_filtern"></span>
===Filter gaps===


[[Datei:Abweichungsfilter.png|right|thumb|900px|Gap filter]]


With the filter, you can select which type of gaps is displayed:
*negative: review questions/results that were evaluated < the target score
*none: review questions/results that were evaluated = the target score
*positive: review questions/results that were evaluated > the target score
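The filter categories above amount to comparing each evaluation against the target score. A minimal sketch (the function name is an illustrative assumption, not a HITGuard identifier):

```python
# Minimal sketch of the gap filter described above: a review question/result
# is classified by comparing its evaluated score against the target score.
# The function name is illustrative, not part of HITGuard.

def classify_gap(evaluated_score: int, target_score: int) -> str:
    """Return the filter category: 'negative', 'none', or 'positive'."""
    if evaluated_score < target_score:
        return "negative"   # evaluated below the target score
    if evaluated_score > target_score:
        return "positive"   # evaluated above the target score
    return "none"           # exactly meets the target score

print(classify_gap(2, 4))  # negative
print(classify_gap(4, 4))  # none
print(classify_gap(5, 4))  # positive
```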


<span id="Target_Score_Gewichtung"></span>
=== <span id="Target score weighting"></span><span id="Zielreifegrad-Gewichtung"></span>Target score weighting ===


What the target score level is and where it is set can be found under [[Special:MyLanguage/Managementsysteme#Aktiver Analysezeitraum | Management systems]].
Wherever gaps occur, there is an additional form of sorting: the target score weighting. This is available, for example, under "Risk management → Vulnerabilities → Gaps".


If activated, the sorting of protection targets is based on the target score weighting. The greater the deviation from the target score level and the greater the weighting of the protection target, the greater the target score weighting: target score weighting = degree of deviation * weighting of the protection target.


Note: A response of "No" corresponds to score level 1, "Partially" corresponds to score level 3.


Examples for illustration, with a protection target weighting of mean (3):
*Score of deviation = 2, target score = 4 =&gt; degree of deviation = 2, target score weighting = 2 * 3 = 6.
*Score of deviation = 4, target score = 4 =&gt; degree of deviation = 0, target score weighting = 0 * 3 = 0.
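The examples above can be reproduced in a few lines. This is an illustrative sketch of the stated formula only; the function name and the assumption that the degree of deviation never goes below zero are mine, not HITGuard's.

```python
# Reproduces the target score weighting examples above:
#   target score weighting = degree of deviation * protection target weighting
# where the degree of deviation is target score minus evaluated score.
# Clamping at zero is an assumption for illustration; the function name
# is not a HITGuard identifier.

# From the note above: "No" corresponds to score level 1, "Partially" to 3.
ANSWER_SCORES = {"No": 1, "Partially": 3}

def target_score_weighting(score: int, target: int, protection_weight: int) -> int:
    degree_of_deviation = max(target - score, 0)  # assumed non-negative
    return degree_of_deviation * protection_weight

# Examples from the text, protection target weighting = mean (3):
print(target_score_weighting(2, 4, 3))  # degree 2 -> 2 * 3 = 6
print(target_score_weighting(4, 4, 3))  # degree 0 -> 0 * 3 = 0
```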


[[Datei:Zielreifegrad Gewichtung anwenden.gif|left|thumb|900px|Apply target score weighting]]<br clear=all>


<span id="Abklärungsbedarf"></span>
== <span id="Need for clarification"></span>Clarification needed ==


Under "Risk management → Vulnerabilities → Reviews | Review objects | Gaps | <u>Clarification needed</u>", you will find all review questions/review results that were marked with "Clarification needed" in the course of a review.


[[Datei:Abklärungsbedarf Übersicht.png|left|thumb|900px|Overview of review questions/review results requiring clarification]]
<br clear=all>


This label is needed in practice when, while answering a review question, you cannot yet determine how the question should be answered. This can happen if, for example, you would need to consult another person or otherwise research the information. Following a series of reviews, the system evaluates which questions still need to be researched. This is exactly what the "Clarification needed" view is for.


If you click on a review question/result, you will be redirected to it.


It is also possible to export a list of all review questions/results requiring clarification via the "Export" button (next to the search bar). This provides an easy-to-use list of the review questions that require clarification.

Aktuelle Version vom 22. August 2025, 09:47 Uhr

What is a review?

In HITGuard, a review is understood to be the recording of deviations from a target state. For example, a review can be an audit by an external auditor. The findings that the auditor may have handed over to you in the form of a report can be entered in HITGuard as a so-called "review result".

Review results can also arise from a review with HITGuard. This is done by using knowledge bases in the "gap analyses". Here, a review is guided by structured questionnaires, with the help of which deviations from the desired target state are determined.

The target state is referred to as the target score level in HITGuard and can be set separately for each management system. Only experts or administrators can set and change the target score level under "Administration → Management Systems".

Reviews (gap analyses/review results)

Under "Risk management → Vulnerabilities → Reviews | Review objects | Gaps | Clarification needed", professionals and experts can find all reviews that have been created in the management system. All reviews are displayed, regardless of whether they are completed, in progress, or in draft status. New reviews can also be created or requested here. Furthermore, the reviews can also be downloaded as PDF or Word files.

Overview of the reviews


Create/edit review

Review result:

  • A review result means, for example, the findings that were handed out by the auditor in the course of an audit, possibly in the form of a report.
  • These findings can be entered using the "Add review result" button, accessible via the dropdown in the "Plus" button.

Gap analysis:

  • Gap analyses are questionnaire-based reviews (KB) on specific topics. These questionnaires can be used, for example, to determine the degree of compliance with a standard. In addition to the questionnaire topics, other review results can be recorded.
  • If a translation of the KB is available in the currently selected language (flag on the top right, next to the "Logout" button), it will be applied.
  • To create a gap analysis, click on the "Plus" button.

In terms of procedure, the only difference between the two review options is that a review result cannot handle review objects based on knowledge bases.

To edit a review, double-click on it in the overview.

For more information on creating or editing a review, whether gap analysis or review result, see Create/Edit review.

Copy review

Create duplicate

A copy of a review can be created at the click of a button. The structure of the review objects is taken from the original, but not the answers or any linked elements (measures, controls, ...). Copies are created in the "Draft" state, with "- Copy" appended to the name.

Create revaluation

A revaluation of a review can be created at the click of a button. Here, not only the structure of the review objects is taken from the original; you can also decide whether to adopt the answers and justifications. Revaluations are created in the "Draft" state, with "- Revaluation" appended to the name. The original must be in the "Closed" state for this to work.

The following section explains how the navigation in the review wizard works.

Review wizard


The navigation in the wizard for performing reviews works as follows:

  • Clicking on "Next" takes you to the next step or to the next review question.
  • Clicking on "Back" takes you to the previous step or to the previous review question.
  • Clicking in the navigation tree on the left side will take you to the desired location.
  • At the bottom left of the navigation pane, the review questions can be shown in the navigation tree via the "Review questions" checkbox. The wizard remembers whether the checkbox was selected or deselected and keeps the desired behavior.
  • If you show the review questions, you can also navigate to them via the tree. In the same way, you are taken back to the review question/result if you have left the review by creating a measure or control. Generally, the left part always shows the review questions/results that are currently visible on the right.
  • "Save" and "Close" behave as expected.

Regardless of the type of review you perform (gap analysis using a knowledge base or recording review results) the processing steps in the perform review wizard are essentially the same:

  1. Create and save the review
  2. Add topics or review objects and activate the review
  3. Answer the review objects, or request an answer from the interview partner (in a "self assessment")
  4. Check the responses or identified gaps
  5. Complete the review

A review can be performed by an expert or professional. However, they also have the option of requesting the answer to the review from their interview partner in HITGuard. Via workflow support, the interview partner receives a request, completes the response, and returns it to the expert. The expert checks the results and can mark the review as completed and archive it. Deviations identified in the review can be handled at any time, even after the review has been completed. Measures and controls can also be linked at any time.

Status and deletion of a review

A review can have different states. If the e-mail notifications are active in the management system, all persons relevant in the workflow are prompted to perform their tasks when the status changes. This would be, for example, the interview partner if an auditor requests a response, or the auditor themselves if the response is returned.

The status of the review can be changed via the blue button in the upper right corner.

Draft

  • When the review is saved for the first time or deactivated from the "In Progress" status, it is in the "Draft" status.
  • "Draft" means that the review is not yet active and no one has been informed about the review by the system.
  • From this status, the review can be activated, i.e. set to the "In Progress" status.

In progress

  • If the review is activated, it will be set to "In Progress" status. As a result, the interview partner and auditor will see it under "My tasks."
  • Now it is the lead auditor's turn to perform the review or to request a response from the interview partners via "Request response" (only for self assessments).
  • It can be set back to the status "Draft" by selecting "Deactivate review".
  • It can be set to the status "Closed" by selecting "Close review".

Requested (only for self assessments)

  • If the review is requested by the lead auditor, it will be set to "Requested" status. The interview partner will be prompted via e-mail to perform the review.
  • The interview partner can set the status to "Answered" by clicking on "Submit review" after the review has been conducted.
  • Requested self assessments are marked with a badge.

Answered (only for self assessments)

  • If the review is returned by the interview partner with "Submit review", it will be set to the status "Answered". The auditors will be prompted by an e-mail to check the response.
  • Answered self assessments are marked with a badge.
  • It can be returned to the status "Requested" by selecting "Request response" again. The interview partner must then revise their response.
  • It can be put back into the status "Draft" by selecting "Deactivate review" (only auditors will be notified).
  • It can be moved to the status "Closed" by selecting "Close review".

Closed

  • If the review is set to the "Closed" status by "Close review", it is read-only and it can no longer be edited.
  • Caution: A self assessment can only be closed if there is at least one interview partner.
  • Exception: Even in already closed reviews, measures and controls can still be added to or removed from review questions.
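The status transitions described above can be summarized as a small state machine. The following sketch is purely illustrative: the status and action names are taken from this page, but the code itself is not part of HITGuard (side conditions such as "a self assessment needs at least one interview partner to be closed" are not modeled).

```python
# Allowed transitions, as described in this section:
# (current status, action) -> resulting status
TRANSITIONS = {
    ("Draft", "Activate review"): "In Progress",
    ("In Progress", "Deactivate review"): "Draft",
    ("In Progress", "Close review"): "Closed",
    ("In Progress", "Request response"): "Requested",  # self assessments only
    ("Requested", "Submit review"): "Answered",        # self assessments only
    ("Answered", "Request response"): "Requested",     # interview partner revises
    ("Answered", "Deactivate review"): "Draft",        # only auditors are notified
    ("Answered", "Close review"): "Closed",
}

def next_status(current: str, action: str) -> str:
    """Return the resulting status, or raise if the action is not allowed."""
    try:
        return TRANSITIONS[(current, action)]
    except KeyError:
        raise ValueError(f"'{action}' is not allowed in status '{current}'")
```

Note that "Closed" has no outgoing transitions: a closed review is read-only, apart from the exception for measures and controls mentioned above.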

Delete a review

  • With "Delete review" you can delete reviews that are not completed yet.
  • Caution: By deleting, the review objects created in this review as well as gaps already assigned to risks will also be deleted!

Change review type (interview <=> self assessment)

The type of review can be changed only in the "Draft" status. If the type is changed to "Self assessment", the end date changes to the reply deadline.

If the wrong type was set and the review has already been activated, it must first be reset to the "Draft" status via "Deactivate review".

Tips, tricks & best practice

  • This type of analysis is a powerful tool in HITGuard. It is a central component of risk identification and treatment. Detected gaps can be linked with reduction measures and/or monitoring controls directly within the analysis.
  • The crucial benefit of doing this in one step is that any gaps, measures, and controls can be assigned to the identified risk. If the review object is also linked with a structural element, such as an application, this information is also comprehensibly shown in the details of that element.
  • Revaluating instead of evaluating again. From time to time, generally at regular intervals, the status quo should be ascertained again. For this, HITGuard offers the revaluation of analyses, which allows the updating of previous analyses instead of having to perform a completely new analysis. Previous answers can be viewed and even carried over. This makes the development of a review object even more apparent.

Review objects

Under "Risk management → Vulnerabilities → Reviews | Review objects | Gaps | Clarification needed", you will find all the review objects that were created in the course of reviews in the current management system.

Overview of the review objects


Clicking on a review object opens the detailed view.

Edit review object


Here you can see how the review object was answered. If several versions of the review object exist, you can also see how its assessment has developed from one version to the next. Only the header data of a review object can be edited in this view; it cannot be used to answer the review object.

Initiate semi-automatic revaluation

Due to the implementation of measures, review objects can be proposed for semi-automatic revaluation. This happens whenever a measure was either created for a review object in the course of a review or linked to a review object, the "after" value of the vulnerability reduction was set, and the measure has been implemented. Once a measure is implemented, the linked review objects are marked with "Revaluation recommended".

To avoid having to perform a new review every time a measure is implemented, HITGuard offers the option of subjecting these marked review objects to a semi-automatic revaluation. This means that HITGuard automatically updates the gap of the respective review questions of the review objects. A separate review is created for each individual organizational unit. In this process, the review questions that are affected by the implementation of measures are set to the "after" value of the vulnerability reduction.

Execution:

  1. Select review object.
  2. Click the orange arrow "Initiate semi-automatic revaluation".
  3. Select the gaps to be updated.
  4. Click the orange arrow "Perform revaluation for selected gaps".
Semi-automatic revaluation
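The marking and update rule described above can be sketched as follows. This is an illustrative model only, not part of the HITGuard product; the field names (`implemented`, `after_value`, `linked_questions`) are assumptions chosen for the example.

```python
def revaluation_recommended(measure: dict) -> bool:
    """A linked review object is flagged 'Revaluation recommended' when the
    measure is implemented and the 'after' value of the vulnerability
    reduction was set."""
    return measure["implemented"] and measure.get("after_value") is not None

def apply_semi_automatic_revaluation(review_questions: dict, measures: list) -> dict:
    """Return the review questions whose score is updated to the 'after'
    value of an implemented, linked measure."""
    updated = {}
    for measure in measures:
        if not revaluation_recommended(measure):
            continue
        for question in measure["linked_questions"]:
            if question in review_questions:
                updated[question] = measure["after_value"]
    return updated
```

In HITGuard itself, this update is performed per organizational unit: a separate review is created for each one, as described above.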


Tips, tricks & best practice

Review objects can be evaluated multiple times within one analysis. For example, BSI's IT-Grundschutz offers a module "Web applications". If several web applications are operated in an organization, it is recommended to answer the module for each of them. This means the module should be selected multiple times across one or more reviews and linked to the respective resources. Additional tip: give your review objects meaningful, telling names, such as "Web application 123_Cloud". This makes it easy to search the review objects when performing a revaluation.


Gaps

Under "Risk management → Vulnerabilities → Reviews | Review objects | Gaps | Clarification needed", you will find all gaps that were identified during the performance of reviews.

Overview of gaps


The columns "Measure missing", "Target value missing", and "Target value too low" can be used to identify gaps for which nothing, or too little, has been done so far. These gaps are tagged in the grid. If a gap has no tag, this means that the gap is already being addressed.

Here you have the option to assign gaps that have not yet been assigned to a risk.

Double-clicking on a gap opens the review at the point where the gap was detected. Here, measures and controls for the gap can now be defined. For more information, see Answer review questions.

Optionally, a column can be displayed that shows whether a line is a review question (from a knowledge base) or a review result (freely entered). This allows experts to expand their self-developed knowledge bases with review results that are frequently added to reviews during interviews.

Filter gaps

Gap filter

With the filter, you can select which type of gaps is displayed:

  • negative: review questions/results that were evaluated < the target score
  • none: review questions/results that were evaluated = the target score
  • positive: review questions/results that were evaluated > the target score
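The three filter categories correspond to a simple comparison against the target score. A minimal sketch, assuming integer score levels (the function name is chosen for the example and not part of HITGuard):

```python
def gap_type(score: int, target_score: int) -> str:
    """Classify a review question/result relative to the target score:
    below the target is 'negative', above it is 'positive', equal is 'none'."""
    if score < target_score:
        return "negative"
    if score > target_score:
        return "positive"
    return "none"
```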

Target score weighting

What the target score level is and where it is set can be found under Management systems. Wherever gaps occur, there is an additional form of sorting: the target score weighting. This is possible, for example, under "Risk management → Vulnerabilities → Gaps".

If activated, the protection targets are sorted by the target score weighting. The greater the deviation from the target score level and the greater the weighting of the protection target, the greater the target score weighting: target score weighting = degree of deviation * weighting of the protection target.

Note: A response of "No" corresponds to score level 1, "Partially" corresponds to score level 3.

Examples for illustration, with protection target weighting Medium (3):

  • score = 2, target score = 4 => degree of deviation = 2, target score weighting = 2 * 3 = 6
  • score = 4, target score = 4 => degree of deviation = 0, target score weighting = 0 * 3 = 0
Apply target score weighting
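The calculation above can be reproduced in a few lines. This sketch assumes integer score levels and, since the examples only cover scores at or below the target, treats the degree of deviation as the shortfall from the target score; how HITGuard weights scores above the target is not specified here.

```python
# Mapping stated in the note above: "No" is score level 1, "Partially" is 3.
SCORE_BY_ANSWER = {"No": 1, "Partially": 3}

def target_score_weighting(score: int, target_score: int,
                           protection_target_weight: int) -> int:
    """target score weighting = degree of deviation * weighting of the
    protection target (degree of deviation assumed to be the shortfall)."""
    degree_of_deviation = target_score - score
    return degree_of_deviation * protection_target_weight
```

With a protection target weighting of Medium (3), this reproduces the two examples: a score of 2 against a target of 4 yields 6, a score of 4 yields 0.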


Clarification needed

Under "Risk Management → Vulnerabilities → Reviews | Review objects | Gaps | Clarification needed", you will find all review questions/review results that were marked with "Clarification needed" in the course of a review.

Overview of review questions/review results requiring clarification


This label is needed in practice when, while answering a review question, you cannot yet determine how it should be answered. This can happen if, for example, you need to consult another person or research the information elsewhere. After a series of reviews, you can then evaluate which questions still need to be researched. This is exactly what the "Clarification needed" view is for.

If you click on a review question/result, you will be redirected to it.

It is also possible to export a list of all review questions/results requiring clarification via the "Export" button (next to the search bar). This provides an easy-to-use list of the review questions that require clarification.