
Add quality check of calibration results #25

Merged: 24 commits from add_quality_check into master, Dec 14, 2021

Conversation

daniel-mills-cqc (Collaborator)

This adds a check on the quality of the calibration results. In particular, it checks whether a large fraction of the calibration circuits have noisy expectation values close to that of the original circuit. The training procedure is more accurate when a large fraction of the calibration circuits have expectation values close to the original's.
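A minimal sketch of the kind of check this adds, using plain floats in place of the operator expectation values the actual task handles (the helper name and float inputs are illustrative, not the merged implementation; `distance_tolerance`, `is_far_count`, and the warning text appear in the excerpts below):

import warnings
from typing import List

def check_calibration_quality(
    original_expectation: float,
    calibration_expectations: List[float],
    distance_tolerance: float = 0.1,
) -> None:
    # Count calibration circuits whose noisy expectation value is far
    # from that of the original circuit.
    is_far_count = sum(
        abs(exp - original_expectation) > distance_tolerance
        for exp in calibration_expectations
    )
    # Warn if this holds for more than half of the calibration circuits,
    # since the regression fit is then likely to be poor.
    if is_far_count > len(calibration_expectations) / 2:
        warnings.warn(
            "Training data regularly differs significantly from original circuit. Fit may be poor."
        )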

@daniel-mills-cqc daniel-mills-cqc marked this pull request as ready for review December 13, 2021 11:27
sjdilkes (Collaborator) left a comment

Graph wiring is nice, just a couple of bits.

_experiment_taskgraph.parallel(_states_sim_taskgraph)
_post_task_graph = TaskGraph(_label="QualityCheckCorrect")
_post_task_graph.parallel(_post_calibrate_task_graph)
_post_task_graph.prepend(cdr_quality_check_task_gen(distance_tolerance=kwargs.get("distance_tolerance", 0.1)))
Collaborator

Please add this as a "key" parameter in the documentation.
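For reference, a sketch of what the requested entry might look like as a `:key` docstring field (assuming that is the convention the comment refers to; the generator name here is hypothetical, and the default of 0.1 comes from the snippet above):

def example_mitex_generator(*args, **kwargs):  # hypothetical name
    """
    ...
    :key distance_tolerance: Absolute difference within which a calibration
        circuit's noisy expectation value is considered close to that of
        the original circuit, defaults to 0.1.
    """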

def cdr_quality_check_task(
    obj,
    noisy_expectation: List[QubitPauliOperator],
    state_circuit_exp: List[List[Tuple[QubitPauliOperator, QubitPauliOperator]]],
):
Collaborator

-> Tuple[whatever typing is]
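Assuming the task returns its inputs unchanged after performing the check (an assumption; the excerpt does not show the body), the requested annotation might read:

def cdr_quality_check_task(
    obj,
    noisy_expectation: List[QubitPauliOperator],
    state_circuit_exp: List[List[Tuple[QubitPauliOperator, QubitPauliOperator]]],
) -> Tuple[List[QubitPauliOperator], List[List[Tuple[QubitPauliOperator, QubitPauliOperator]]]]:
    ...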


# Raise a warning if the calibration circuits regularly have noisy
# expectation value far from the original circuit.
if is_far_count > len(calibration) / 2:
Collaborator

Is 50% a reliable proportion, or should it be possible to pass this ratio as a parameter?

Collaborator Author

Yeah, a parameter would probably be better.

@@ -123,7 +127,7 @@ def cdr_quality_check_task(

     # Raise a warning if the calibration circuits regularly have noisy
     # expectation value far from the original circuit.
-    if is_far_count > len(calibration) / 2:
+    if is_far_count > len(calibration) * calibration_fraction:
         warnings.warn(
             "Training data regularly differers significantly from original circuit. Fit may be poor."
Collaborator

differers -> differs
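If the new `calibration_fraction` is exposed the same way as `distance_tolerance` in the wiring snippet earlier, the generator call might become (hypothetical, mirroring that pattern; a default of 0.5 would preserve the previous hard-coded half):

_post_task_graph.prepend(
    cdr_quality_check_task_gen(
        distance_tolerance=kwargs.get("distance_tolerance", 0.1),
        calibration_fraction=kwargs.get("calibration_fraction", 0.5),
    )
)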

sjdilkes previously approved these changes Dec 14, 2021
@daniel-mills-cqc daniel-mills-cqc merged commit 763d0b8 into master Dec 14, 2021
@daniel-mills-cqc daniel-mills-cqc deleted the add_quality_check branch December 14, 2021 11:00