Optimisation

* at least VIEW FRAME.pck to check the agreement between predicted and observed spots on the last frame of the dataset. It would be wise to also VIEW MODPIX.pck and VIEW DECAY.pck to get an impression of systematic effects in your data. When doing this, take the scale shown in the bar on the right into account! (A command-line example follows below.)
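
A minimal sketch of how these control images can be inspected from the processing directory, assuming the VIEW utility mentioned above (or the newer xds-viewer) is in your PATH; recent XDS versions write these images as .cbf rather than .pck, so adjust the names if necessary:

<pre>
# agreement between predicted and observed spots on the last frame
VIEW FRAME.pck
# control images showing systematic effects in the data
VIEW MODPIX.pck
VIEW DECAY.pck
</pre>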


== Further optimization based on [[XDSSTAT]] output ==


* inspect the table of R_meas values (the lines ending with 'L') and decide whether you want to remove any specific frames, by appending .bad to the filenames and re-running INTEGRATE and CORRECT (see the sketch after this list)
* inspect the table of R_d values (the lines ending with 'DIFFERENCE') and find out whether R_d rises systematically, which would be an indication of strong radiation damage; this works best in high-symmetry space groups (a way to extract the table is shown after the list)
* inspect the table of R_meas ''versus'' PEAK and ln(intensity) and consider adjusting MINPK (the threshold for rejecting overlaps) to a higher value. For better data you may want to raise MINPK to, say, 85, 90 or even 95, but of course this will reduce the completeness. Find the right compromise between completeness and data quality for your purposes! Experimental phasing relies on high accuracy (in particular of the strong reflections), whereas maps and refinement benefit from good completeness. (An XDS.INP example follows the list.)
* inspect the .pck files written by XDSSTAT, in particular scales.pck, rf.pck and anom.pck, and decide whether you are happy with e.g. your low-resolution cutoff (see the viewing example below)! It is normal for scales.pck to show alternating white and black at high resolution, and it is also normal for rf.pck to be bright (high R-factors) at high resolution. But you definitely don't want such indications at low resolution.
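
If you decide to remove frames, the recipe from the first bullet point is to rename the offending image files so that XDS no longer sees them, and then to repeat the last two processing steps. A minimal sketch, with a purely hypothetical frame name mydata_0123.img:

<pre>
# hide the bad frame from XDS (the file name is only an example)
mv mydata_0123.img mydata_0123.img.bad

# then set in XDS.INP:
#   JOB= INTEGRATE CORRECT
# and re-run xds_par (or xds)
</pre>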
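To follow R_d over the dataset, the relevant table can be pulled out of the XDSSTAT output; the sketch below assumes that output was redirected to a file called XDSSTAT.LP, and the exact column layout should be checked against the table header in that file:

<pre>
# the R_d table lines end with the keyword DIFFERENCE
grep "DIFFERENCE" XDSSTAT.LP > Rd.tab
# plot R_d against the frame-number difference with your favourite
# plotting tool and look for a systematic rise (radiation damage)
</pre>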
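Raising MINPK is done in XDS.INP before repeating the integration and scaling; the value of 90 below is only an example:

<pre>
! reject reflections whose observed fraction of the profile (PEAK) is below 90%
! (the XDS default is MINPK=75)
MINPK= 90
JOB= INTEGRATE CORRECT
</pre>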
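The XDSSTAT control images can be inspected in the same way as the XDS ones above, e.g.:

<pre>
VIEW scales.pck
VIEW rf.pck
VIEW anom.pck
</pre>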


== Final polishing ==