---
title: "GLOBIOM-G4M-link"
subtitle: R notebook for automating the link between GLOBIOM and G4M.
author:
- Andrey L. D. Augustynczik
- Albert Brouwer
date: "`r format(Sys.time(), '%d %B %Y')`"
output: html_notebook
---

Before using this notebook, please first read the [introductory documentation](https://iiasa.github.io/GLOBIOM-G4M-link). Run the code chunks in first-to-last order. You can have RStudio run all chunks in one go, but to configure and test your setup---and because some chunks take a long time to run---it is advisable to start out by running chunks one-by-one at least until you have verified that everything is working smoothly.
Carefully check the output of each chunk for errors; if any occur, make adjustments and re-run the chunk that failed.
## Read the default configuration and list of optional configuration settings
```{r}
# Remove any objects from environment and read the default configuration
rm(list=ls())
source("R/configuration/default.R")
# Collect the names and types of the default config parameters
config_names <- ls()
config_types <- lapply(lapply(config_names, get), typeof)
# Read list of optional config settings and remove mandatory settings from the environment
source("R/configuration/optional.R")
rm(list=config_names[!(config_names %in% OPTIONAL_CONFIG_SETTINGS)])
```
## Load required packages
```{r}
library(gdxrrw)
suppressWarnings(library(tidyverse))
library(stringr) # attached with the core tidyverse in recent versions; loaded explicitly here for robustness
library(fs)
```
## Customize configuration
```{r}
custom_configuration_file <- "R/configuration/custom.R"
# Check that a custom configuration exists; if not, create one from the default and open it for editing.
if (!file_exists(custom_configuration_file)) {
  warning("Custom configuration file missing, please customize.")
  file_copy("R/configuration/default.R", custom_configuration_file)
  rstudioapi::navigateToFile(custom_configuration_file)
}
```
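The custom configuration uses the same `name <- value` assignments as `R/configuration/default.R`. Purely as an illustration (these setting names are used elsewhere in this notebook, but the values are placeholders you must adapt), a customized file might contain entries such as:
```{r, eval=FALSE}
# Illustrative sketch only: placeholder values, adapt to your own project.
PROJECT              <- "my_project"  # project label used in output file names
DATE_LABEL           <- "20240101"    # date label used in output file names
GENERATE_PLOTS       <- TRUE          # whether to produce result plots in step 7
USE_LIMPOPO_POSTPROC <- FALSE         # selects the final post-processing variant in step 6
```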
## Load and check custom configuration
When all is fine, the configuration is echoed.
```{r}
source(custom_configuration_file, local=TRUE, echo=FALSE)
source("R/configuration/check_and_echo.R")
```
## Check that submodules and a GLOBIOM branch are available
When executing this chunk fails, [read this](https://iiasa.github.io/GLOBIOM-G4M-link/#getting-set).
```{r}
if (!file_exists(path("Condor_run_R", "Condor_run.R"))) stop ("Condor_run_R submodule content is missing!")
if (!file_exists(path(WD_DOWNSCALING, "README.md"))) stop ("DownScaling submodule content is missing!")
if (!dir_exists(path(WD_GLOBIOM, "Model"))) stop ("No GLOBIOM branch has been checked out!")
```
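If either submodule check fails, the submodule content usually just needs to be fetched; the linked documentation covers the full setup. A minimal sketch, assuming Git is available on your path and the working directory is the root of the repository's working tree:
```{r, eval=FALSE}
# Fetch the content of all submodules (e.g. Condor_run_R, DownScaling).
# Typically needed once after cloning the repository.
system("git submodule update --init --recursive")
```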
## Set up directories and functions
```{r}
# Cache working directory, should equal the root directory of the repo's working tree.
CD <- getwd()
# Canonicalize path to temporary directory for this R session
TEMP_DIR <- path(tempdir())
# Source support functions
source("R/run_submission.R")
source("R/transfer.R")
source("R/post_processing.R")
source("R/helper_functions.R")
# Compile G4M
compile_g4m_model()
# Compile G4M output processing library
compile_table_merger()
# Load dll with G4M report generation routine
dyn.load(path(CD, WD_G4M, "tableMergeForLinker", "merge_files.so"))
```
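As an optional sanity check (not part of the original workflow), you can confirm that the merge library was registered with this R session; its basename should appear among the loaded DLLs/shared objects:
```{r, eval=FALSE}
# Optional check: the G4M merge library should now be loaded into this session.
if (!"merge_files" %in% names(getLoadedDLLs())) {
  warning("merge_files shared library does not appear to be loaded.")
}
```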
## 1. Run GLOBIOM scenarios
```{r}
cluster_nr_globiom <- run_globiom_scenarios()
print("GLOBIOM scenarios run complete.")
```
## 2. Run initial post-processing
Merge and export GLOBIOM output for use by downscaling and G4M.
```{r}
run_initial_postproc(cluster_nr_globiom)
print("Initial post processing complete.")
```
## 3a. Perform the initial downscaling---needs data from the initial GLOBIOM run
```{r}
if (!file_exists(path(WD_DOWNSCALING, "input", str_glue("output_landcover_{PROJECT}_{DATE_LABEL}.gdx")))) {
  stop("File for downscaling not found! Please perform the initial GLOBIOM run before downscaling.")
}
cluster_nr_downscaling <- run_initial_downscaling()
print("Initial downscaling complete")
```
## 3b. Merge downscaling outputs and transfer them to G4M
```{r}
# Transfer the downscaled output to the G4M input folder.
# For large simulations, the file transfer is parallelized.
if (length(SCENARIOS_FOR_DOWNSCALING) < 50) {
  merge_and_transfer(cluster_nr_downscaling)
} else {
  run_merge_and_transfer(cluster_nr_downscaling)
}
```
## 3c. Generate CSV input files for G4M
```{r}
if (!file_exists(path(PATH_FOR_G4M, str_glue("GLOBIOM2G4M_output_LC_abs_{PROJECT}_{DATE_LABEL}_1.csv")))) {
  stop("File for G4M run not found! Please perform initial downscaling first.")
}
gdx_to_csv_for_g4m()
```
## 4. Preliminary: Run baseline G4M scenarios---needs data from the initial downscaling
```{r}
run_g4m(baseline = TRUE)
print("G4M baseline scenario run complete")
```
## 5a. Preliminary: Run G4M scenarios---needs data from the initial downscaling and G4M baseline scenarios
```{r}
allfiles <- dir_ls(path(CD, WD_G4M, "out", str_glue("{PROJECT}_{DATE_LABEL}"), "baseline"))
base_files <- str_detect(allfiles, "NPVbau_[:print:]+\\.bin")
if (!any(base_files)) stop("Baseline G4M run not found! Please run the baseline before the G4M scenarios.")
# Run G4M - final scenarios
run_g4m(baseline = FALSE)
print("G4M scenario run complete")
```
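If the check above fails unexpectedly, listing what the baseline output folder actually contains can help pinpoint the problem. A small diagnostic sketch that reuses the objects from the previous chunk:
```{r, eval=FALSE}
# Diagnostic aid (not part of the workflow): show all files in the baseline
# output folder and the subset that matched the NPVbau_*.bin pattern.
print(allfiles)
print(allfiles[base_files])
```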
## 5b. Preliminary: Compile G4M data and produce the report for GLOBIOM---needs data from the G4M run
```{r}
compile_g4m_data(baseline = FALSE)
```
## 6. Preliminary: Perform the final post-processing---needs data from the G4M run
```{r}
if (USE_LIMPOPO_POSTPROC) {
run_final_postproc_limpopo(cluster_nr_globiom)
} else {
run_final_postproc(cluster_nr_globiom)
}
print("Final post-processing complete")
```
## 7. Remove files and plot results
```{r}
# Generate plots for results
if (GENERATE_PLOTS) plot_results(SCENARIOS_PLOT_LOOKUP, SCENARIOS_PLOT_GLOBIOM)
# Remove global environment files
clear_environment()
# Remove GLOBIOM gdx files
clear_glob_files()
# Remove Downscaling gdx files
clear_downs_files()
# Remove G4M output files
clear_g4m_files()
```