Conversation

@eduardojsbarroso
Contributor

No description provided.

@eduardojsbarroso
Contributor Author

@combet I created this stage and it seems to be working, but I have a couple of questions. I checked that in the notebook example you provided here, you already create the source galaxies behind the cluster. Since we now have source_tomo_bins and cluster_tomo_bins that might not be the same, I made a filter to check whether the bins overlap; if there is any overlap, I do not compute the cross-correlation.
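
To make the filter concrete, here is a minimal sketch of the kind of check I mean (the zmin/zmax bin-edge attributes are placeholders, not the actual TXPipe names): a source bin is only used if it sits entirely behind the cluster bin in redshift.

def use_source_bin(cluster_bin, source_bin):
    # Placeholder attributes: each bin object is assumed to carry its redshift edges.
    # Only keep source bins whose redshift range starts above the cluster bin's range,
    # i.e. skip any pair whose ranges overlap at all.
    return source_bin.zmin >= cluster_bin.zmax

pairs_to_correlate = [
    (c_bin, s_bin)
    for c_bin in cluster_bins
    for s_bin in source_bins
    if use_source_bin(c_bin, s_bin)
]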

But then, for a given cluster bin, I took the cross-correlations with all the source bins and computed the mean profile. I don't know if this is the right approach (here is pseudocode to explain what I mean):

for cluster_bin in cluster_bins:
    tg_signal = 0.0
    total_pairs = 0.0
    for s_bin in source_bins:
        tg_signal += s_bin.xi * s_bin.npairs   # pair-weighted sum over source bins
        total_pairs += s_bin.npairs
    mean_profile = tg_signal / total_pairs     # pair-weighted mean tangential profile

Also, by doing it this way I do not know how to properly compute the covariance. I have a first code implementation that seems right, apart from the covariance computation. @marina-ricci, if you have any thoughts on this, let me know.
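
One route I could imagine for the covariance is TreeCorr's built-in jackknife, which only works if the catalogs are built with patches; a minimal sketch of what I mean (binning values and variable names are placeholders, not what the stage actually uses):

import treecorr

# Assumes cluster_cat and source_cat were created with npatch (or patch_centers), e.g.
# treecorr.Catalog(ra=..., dec=..., ra_units='deg', dec_units='deg', npatch=20).
ng = treecorr.NGCorrelation(min_sep=0.5, max_sep=50.0, nbins=15,
                            sep_units='arcmin', var_method='jackknife')
ng.process(cluster_cat, source_cat)     # cluster positions x source shears
cov = ng.estimate_cov('jackknife')      # (nbins x nbins) covariance of ng.xi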

An example of what I did is in TXPipe/notebooks/cluster_counts/gt_treecor_test/Run_CL_pipeline_treecor.ipynb

I gave up on using the rlens metric, since I do not have the sources in that metric as well, and the cross-correlation does not work if the catalogs are not in the same metric. I also do not know how to generate the center_patches, so this stage is an initial implementation.
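
On the patch centers, my understanding (to be double-checked) is that TreeCorr can generate them itself from one catalog, and the same centers can then be reused for the other catalogs, something along these lines (the file name and npatch value are placeholders):

import treecorr

# Let TreeCorr split the source catalog into patches once and save the centers...
source_cat = treecorr.Catalog(ra=ra_src, dec=dec_src, g1=g1_src, g2=g2_src,
                              ra_units='deg', dec_units='deg', npatch=20)
source_cat.write_patch_centers('patch_centers.fits')

# ...then build the cluster catalog with the same centers, so the jackknife
# patches are consistent between the two catalogs.
cluster_cat = treecorr.Catalog(ra=ra_cl, dec=dec_cl,
                               ra_units='deg', dec_units='deg',
                               patch_centers='patch_centers.fits')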

@combet
Contributor

combet commented Oct 20, 2025

Thanks @eduardojsbarroso. I don't know either if this is the right approach (I'm not very familiar with this) or if there are some extra subtleties to consider (pinging @tae-h-shin who might have some insight).

Do you have a way to merge the 3 catalogues from the 3 source tomographic bins into a single one, recompute, and compare the results to the averaging you do above? This would give us a hint as to whether the two are equivalent.
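
Something along these lines is what I have in mind (just a rough sketch; the per-bin attribute names and binning values are placeholders):

import numpy as np
import treecorr

# Concatenate the non-overlapping source bins into one catalog and run a single
# cluster-position x source-shear correlation against the cluster bin.
ra  = np.concatenate([b.ra for b in source_bins])
dec = np.concatenate([b.dec for b in source_bins])
g1  = np.concatenate([b.g1 for b in source_bins])
g2  = np.concatenate([b.g2 for b in source_bins])
w   = np.concatenate([b.weight for b in source_bins])

merged_src = treecorr.Catalog(ra=ra, dec=dec, g1=g1, g2=g2, w=w,
                              ra_units='deg', dec_units='deg')
ng = treecorr.NGCorrelation(min_sep=0.5, max_sep=50.0, nbins=15, sep_units='arcmin')
ng.process(cluster_cat, merged_src)
# ng.xi can then be compared to the pair-weighted average of the per-bin profiles above.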

@eduardojsbarroso
Contributor Author

Thank you for the feedback. Yes, I can do this. I was just unsure whether that is the right approach either, because each source bin has its own shear calibration factor, so it is not obvious that a cross-correlation with a single merged "1 bin" catalogue is also a good way to do it. I can try it and also compare against the tangential shear from the individual profiles to see if everything makes sense, but I will have to do this in a couple of weeks, since right now I have some other urgent matters to take care of :/
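
To be explicit about the calibration worry: if each source bin has its own multiplicative calibration, I imagine the shears would need to be calibrated per bin before being concatenated, something like the sketch below (the calibration values, the 1/(1+m) form, and the bin attribute names are all made up for illustration, not how TXPipe actually stores or applies the calibration):

import numpy as np

# Hypothetical per-bin multiplicative calibration factors (1 + m_i).
one_plus_m = {0: 1.02, 1: 0.99, 2: 1.01}

g1_cal = np.concatenate([b.g1 / one_plus_m[b.index] for b in source_bins])
g2_cal = np.concatenate([b.g2 / one_plus_m[b.index] for b in source_bins])
# These calibrated shears would then go into the merged catalogue above.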

Development

Successfully merging this pull request may close these issues.

Implement Cluster reduced shear measurement using pos x shear correlation using TreeCorr
