Sensitivity analysis #113 (Open)

wants to merge 50 commits into base: development

Changes from 1 commit

Commits (50)
fcf304e
Adding sensitivity analysis folder and files, minus batch script
Aug 21, 2024
1608c1d
Merged development into sensitivity-analysis
Aug 22, 2024
692cf7a
Added beta distribution to initialization and made necessary fun conc…
Aug 22, 2024
a651514
Not yet ready, in middle of changes for Snellius run.
Aug 23, 2024
f5571da
fixed lazy path errors, now pushing to test on Snellius again
Aug 23, 2024
31f8557
Batch script now runs on Snellius
Aug 26, 2024
ce253ac
Changed some print statements
Aug 27, 2024
cb71315
De-linting config.py
Aug 27, 2024
8d1f9b0
I think I don't have the right lint roller, I am reverting it back to…
Aug 27, 2024
390a62d
Added network metrics as node properties to make it possible to not s…
Sep 3, 2024
f75280b
Altered pytest to match updated network metrics functions; removed a …
Sep 3, 2024
6897cc5
Added further data collection options for any user who wants to save …
Sep 4, 2024
dced6d9
Attempt to fix the build fail. Reverting with next commit if unsuccce…
Sep 4, 2024
420ceea
Sorry about the build fail; I know it is probably for a reason, so I …
Sep 4, 2024
7bf41ea
I am trying to resolve the build fail again.
Sep 4, 2024
aceedaf
Now writes model_theta values to config.
Sep 9, 2024
8d90489
initialize_model_properties is no longer part of the model object. Fu…
Sep 10, 2024
e44b8e5
Trade money now trades money. PTM step now updates with theta of t-1…
Oct 10, 2024
edb37d3
Attempt to fix pytest.
Oct 10, 2024
debd316
Bellman equation NN update
Nov 18, 2024
4546f4e
Added GPU availability condition to run file
Nov 18, 2024
55c9112
added scheduled shock argument, fixed bug in model identifier
Nov 19, 2024
9d8e17b
fixed typo in SBATCH time
Nov 19, 2024
444bd7e
corrected environment name for Snellius
Nov 19, 2024
ad49889
attempting fix of environment setup on Snellius
Nov 21, 2024
8cbdad3
Correcting bash script file reference for Snellius
Nov 21, 2024
c061769
Re-ordered global theta in initialization to fix seed responsiveness
Nov 21, 2024
1ec12ec
Added profiling request to bash script
Nov 21, 2024
c5aa698
Changed folder path
Nov 21, 2024
9722c73
Added initialization of income and technology index; removed unused di…
Nov 25, 2024
4189c7a
added null-variety experiment scripts
Jan 3, 2025
6e7f825
changed conda source directory due to very annoying Snellius permissi…
Jan 7, 2025
788d343
Revert "changed conda source directory due to very annoying Snellius …
Jan 7, 2025
66ab367
changed conda source directory due to very annoying Snellius permissi…
Jan 7, 2025
eafb6e2
fix bug in no adaptation version of wealth consumption
Jan 7, 2025
288b499
debugging on Snellius with no_adapt_run script
Jan 7, 2025
f66eb24
Adding tool to check equal initialization for experimental setups.
Jan 13, 2025
166fe94
Adding 4-panel plot tool.
Jan 16, 2025
1653ee1
Added data processor.
Jan 19, 2025
43cb124
added generic bash script that accepts python file as argument.
Jan 19, 2025
eaecd0a
pushed updated codes actually used in Snellius
Jan 20, 2025
f19c674
Added new data request for aggregated wealth and consumption in data_…
Jan 20, 2025
a507737
Fixed consumption typo in data_processor.py
Jan 20, 2025
250172d
testing i_a data output on snellius
Jan 20, 2025
577a525
Adding seed-wise mutual information calculation and zero variance skip
Jan 21, 2025
e6e1102
histogram data for timestep 25
Jan 22, 2025
f125340
histogram data for timestep 74
Jan 22, 2025
b1f6c6f
Trying to write directly to research drive from Snellius.
Jan 25, 2025
35da21d
Trying to write directly to research drive from Snellius.
Jan 25, 2025
8982eb3
Wasserstein distance added.
Jan 28, 2025
Commit 692cf7ae510665d320e16500f30101a77ccff86c (Victoria authored and committed Aug 22, 2024)
Added beta distribution to initialization and made necessary fun concessions in config. Added sensitivity analysis related things. This version has been tested locally only.
24,578 changes: 12,289 additions & 12,289 deletions Sensitivity_Analysis/SaltelliSampleParams-n1024.csv

Large diffs are not rendered by default.

9 changes: 6 additions & 3 deletions Sensitivity_Analysis/SampleGenerator.py
@@ -16,19 +16,22 @@
 # event_alpha 4.13+/-50%
 # event_beta 0.07+/-50%

-problem={ "names": ["homophily","local_ratio","noise_ratio","event_alpha","event_beta"],
+problem={ "names": ["homophily","local_ratio","noise_ratio","shock_alpha","shock_beta"],
          "num_vars":5,
          "bounds":[[0,10],[0.2,0.3],[0.04,0.06],[4.13*0.5,4.13*1.5],[0.07*0.5,0.07*1.5]],
          "dists":["unif","unif","unif","unif","unif"]}

 S_sample=sobol.sample(problem,n)

-S_sampledf=pd.DataFrame(S_sample, columns=["homophily","local_ratio","noise_ratio","event_alpha","event_beta"])
+S_sampledf=pd.DataFrame(S_sample, columns=["homophily","local_ratio","noise_ratio","shock_alpha","shock_beta"])

-S_sampledf.index.name="RunID"
+# drop duplicates
+S_sampledf=S_sampledf.drop_duplicates()
+
+# reindex
+S_sampledf.index = pd.RangeIndex(start=1, stop=len(S_sampledf) + 1, step=1)
+S_sampledf.index.name="RunID"

 S_sampledf.to_csv("SaltelliSampleParams-n1024.csv")
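Two notes on this sampler. First, the CSV's 12,289 changed lines are consistent with the Saltelli scheme: with second-order indices enabled (SALib's default), sobol.sample generates N·(2D + 2) = 1024·12 = 12,288 rows for D = 5 parameters, plus the header. Second, SALib's Sobol analyzer expects one output per sample in exactly that order and count, so the drop_duplicates/reindex step needs to be kept in mind when pairing model outputs back up. A minimal sketch of the downstream index computation, which is not part of this commit and assumes a hypothetical model_outputs.csv holding one scalar output per run in RunID order:

# Sketch of the downstream Sobol analysis; not part of this commit.
# "model_outputs.csv" is a hypothetical file with one scalar output per sampled run.
import numpy as np
from SALib.analyze import sobol as sobol_analyze

# Same problem definition as in SampleGenerator.py above
problem = {"names": ["homophily", "local_ratio", "noise_ratio", "shock_alpha", "shock_beta"],
           "num_vars": 5,
           "bounds": [[0, 10], [0.2, 0.3], [0.04, 0.06],
                      [4.13 * 0.5, 4.13 * 1.5], [0.07 * 0.5, 0.07 * 1.5]],
           "dists": ["unif", "unif", "unif", "unif", "unif"]}

Y = np.loadtxt("model_outputs.csv", delimiter=",")
Si = sobol_analyze.analyze(problem, Y)  # returns first-, second-, and total-order indices
print(Si["S1"], Si["ST"])  # first-order and total-order sensitivity indices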


67 changes: 67 additions & 0 deletions Sensitivity_Analysis/SensitivityAnalysis.py
@@ -0,0 +1,67 @@
import sys
sys.path.append('/Users/victoria/Documents/Scripts/Python/DGL-PTM/dgl_ptm')
import dgl_ptm
import os
import torch
import argparse
import collections
os.environ["DGLBACKEND"] = "pytorch"

# keeping None as default to make error obvious
parser = argparse.ArgumentParser(description='Sensitivity Analysis')
parser.add_argument('-r', '--run_id', default=None, type=int, help='Run ID from sample file (default: None)')
parser.add_argument('-s', '--seed', default=None, type=int, help='Seed (default: None)')
parser.add_argument('-ho', '--homophily', default=None, type=float, help='Homophily (default: None)')
parser.add_argument('-l', '--local', default=None, type=float, help='Local attachment ratio (default: None)')
parser.add_argument('-n', '--noise', default=None, type=float, help='Noise ratio (default: None)')
parser.add_argument('-a', '--shock_a', default=None, type=float, help='Shock alpha (default: None)')
parser.add_argument('-b', '--shock_b', default=None, type=float, help='Shock beta (default: None)')

args = parser.parse_args()

model = dgl_ptm.PovertyTrapModel(model_identifier=f'batchtest_{args.run_id}',)

model.set_model_parameters(**{'number_agents': 100 ,
'seed':args.seed,
'gamma_vals':torch.tensor([0.3,0.45]) , #for pseudo income
'sigma_dist': {'type':'uniform','parameters':[0.05,1.94],'round':True,'decimals':1},
'cost_vals': torch.tensor([0.,0.45]), #for pseudo income
'a_theta_dist': {'type':'uniform','parameters':[0.1,1],'round':False,'decimals':None},
'sensitivity_dist':{'type':'uniform','parameters':[0.0,1],'round':False,'decimals':None},
'capital_dist': {'type':'uniform','parameters':[0.1,10.],'round':False,'decimals':None},
'alpha_dist': {'type':'normal','parameters':[1.08,0.074],'round':False,'decimals':None},
'lambda_dist': {'type':'uniform','parameters':[0.05,0.94],'round':True,'decimals':1},
'initial_graph_type': 'barabasi-albert',
'initial_graph_args': {'seed': 1, 'new_node_edges':1},
'device': 'cpu',
'step_target':20,
'steering_parameters':{'npath':'./agent_data.zarr',
'epath':'./edge_data',
'ndata':['all_except',['a_table']],
'edata':['all'],
'mode':'w',
'wealth_method':'weighted_transfer',
'income_method':'income_generation',
'tech_gamma': torch.tensor([0.3,0.35,0.45]),
'tech_cost': torch.tensor([0,0.15,0.65]),
'consume_method':'past_shock_bellman_consumption',
'nn_path': "/nn_data/both_PudgeSixLayer_1024/0723_110813/model_best.pth",
'adapt_m':torch.tensor([0,0.5,0.9]),
'adapt_cost':torch.tensor([0,0.25,0.45]),
'depreciation': 0.6,
'discount': 0.95,
'm_theta_dist': {'type':'beta','parameters':[args.shock_a,args.shock_b],'round':False,'decimals':None},
'del_method':'size',
'noise_ratio': args.noise,
'local_ratio': args.local,
'homophily_parameter': args.homophily,
'characteristic_distance':3.33,
'truncation_weight':1.0e-10,
'step_type':'custom'}})

print(model.config.steering_parameters)

model.initialize_model()
print(model.steering_parameters['modelTheta'])
model.run()
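For reference, a single run can be launched directly from the command line; the parameter values below are illustrative placeholders within the sampled bounds (not rows taken from the sample CSV), matching the argparse flags defined above:

# Illustrative invocation; values are placeholders, not rows from SaltelliSampleParams-n1024.csv
python SensitivityAnalysis.py --seed 1 --run_id 1 --homophily 5.0 --local 0.25 --noise 0.05 --shock_a 4.13 --shock_b 0.07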

46 changes: 46 additions & 0 deletions Sensitivity_Analysis/SensitivityAnalysis.sh
@@ -0,0 +1,46 @@
#!/bin/bash
#SBATCH --job-name=SA1_seed1
#SBATCH -p gpu
#SBATCH --gpus=1

# d-hh:mm:ss
#SBATCH --time=05:30:00


module load 2023
conda activate dgl_ptm_gpu



# Sensitivity Analysis


seed=1
counter=0
start=1
earlystop=3
sample_csv="SaltelliSampleParams-n1024.csv"

while IFS=, read -r RunID homophily local_ratio noise_ratio shock_alpha shock_beta; do
# Check range compliance
if [ "$earlystop" != "None" ] && [ "$RunID" -gt "$earlystop" ]; then
break
fi
# Check range compliance and print parameters
if [ "$RunID" -ge "$start" ]; then
echo Run: "$RunID"
echo Homophily: "$homophily" Local Ratio: "$local_ratio" Noise Ratio: "$noise_ratio" Shock Alpha: "$shock_alpha" Shock Beta: "$shock_beta"
python SensitivityAnalysis.py --seed "$seed" --run_id "$RunID" --homophily "$homophily" --local "$local_ratio" --noise "$noise_ratio" --shock_a "$shock_alpha" --shock_b "$shock_beta"


fi

done < "$sample_csv"
wait
echo " $(date) - Runs $start through $earlystop are complete for seed $seed."
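Note that the read loop above also consumes the CSV's header row, so the numeric test [ "$RunID" -ge "$start" ] prints an "integer expression expected" error for that first line (the script still proceeds, because the failed test simply skips it). A minimal guard, shown as a sketch rather than as part of this commit, is to strip the header before the loop:

# Sketch: skip the CSV header so RunID is always numeric inside the loop.
tail -n +2 "$sample_csv" | while IFS=, read -r RunID homophily local_ratio noise_ratio shock_alpha shock_beta; do
    echo Run: "$RunID"
    # ... same per-run checks and python call as above ...
done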
220 changes: 220 additions & 0 deletions Sensitivity_Analysis/beta_distribution.ipynb

Large diffs are not rendered by default.
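The notebook itself is not rendered here, but the model configuration above draws m_theta from a beta distribution parameterized by the sampled shock_alpha and shock_beta. As a small sketch for inspecting that distribution (the nominal values 4.13 and 0.07 come from the comments in SampleGenerator.py; using torch.distributions is an assumption about what the notebook does):

# Sketch: inspect the shock (m_theta) beta distribution at its nominal parameters.
import torch

dist = torch.distributions.Beta(torch.tensor(4.13), torch.tensor(0.07))
samples = dist.sample((10_000,))
print(samples.mean())  # analytic mean: alpha / (alpha + beta) = 4.13 / 4.20 ≈ 0.983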
