This repository contains companion code for the paper *Graph based sparse neural networks for traffic signal optimization*, submitted to NeurIPS 2019.
Trained models from our experiments can be found in `100k_fit_and_eval_experiment_results.zip`, available from https://drive.google.com/file/d/1zch1oqMd_VEPSqm42I8njT2cJ9mN96Xi/view?usp=sharing; unzip it into the repository's main directory.
The core dataset can be downloaded from https://drive.google.com/file/d/1UHGcHeJrxskkJ7ib9oe9oSz21nRnrCVN/view?usp=sharing and should be saved in the repository's main directory.
The toy dataset is in the archive `toy_set.zip`, available from https://drive.google.com/file/d/1y-25uIiPRQb7zUNQshrh8UXtegLKg25e/view?usp=sharing; unzip it into the repository's main directory.
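The unzip steps above can be scripted with the standard library alone; a minimal sketch (the archive names come from this README, and extracting into the current directory is assumed):

```python
import os
import zipfile

def unzip_to(archive_path, dest_dir="."):
    """Extract a zip archive into dest_dir (created if missing)."""
    os.makedirs(dest_dir, exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(dest_dir)

# Archives named in this README; skipped silently if not downloaded yet.
for archive in ("100k_fit_and_eval_experiment_results.zip", "toy_set.zip"):
    if os.path.exists(archive):
        unzip_to(archive)
```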
Core modules:
- `graph_neural_networks.py` contains basic methods for constructing graph neural networks. This is the core of the whole repository.
- `graph_utils.py` contains several utility methods for working with graphs.
- `data_preparation_utils.py` contains simple methods for preparing data for training (including scaler initialization).
- `training_and_evaluation.py` contains methods for training and evaluating models, reused many times in our experiments.
- `iterative_updaters.py` contains a few basic iterative updaters for use in gradient descent experiments.
- `toy_examples.py` currently contains core code for a single toy problem, related to neighbourhood entropy (similar to eq. (2) in the t-SNE paper (van der Maaten, L. and Hinton, G. (2008))).
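For context on the toy problem in `toy_examples.py`: eq. (2) of the t-SNE paper is the Shannon entropy of the Gaussian-kernel conditional distribution around a point. A minimal sketch of that quantity follows; it is not the repository's actual implementation, and the function name and kernel details are our assumptions:

```python
import numpy as np

def neighbourhood_entropy(distances_i, sigma=1.0):
    """Shannon entropy (in bits) of the conditional distribution p_{j|i}
    induced by a Gaussian kernel over the distances from point i to its
    neighbours, in the spirit of eq. (2) of the t-SNE paper."""
    d = np.asarray(distances_i, dtype=float)
    p = np.exp(-d ** 2 / (2.0 * sigma ** 2))
    p = p / p.sum()
    return float(-np.sum(p * np.log2(p)))
```

With equal distances to n neighbours the distribution is uniform and the entropy is log2(n); skewed distances give lower entropy.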
Experiment scripts:
- `100k_fit_experiment.py` runs the main experiment with multiple graph NN architectures of type 1 (Table 1 in the paper). Fits models.
- `100k_eval_experiment.py` runs the main experiment with multiple graph NN architectures of type 1 (Table 1 in the paper). Evaluates the models produced by `100k_fit_experiment.py`.
- `100k_fit_experiment_kipf.py` runs the main experiment with multiple graph NN architectures of type 2 (Table 2 in the paper). Fits models.
- `100k_eval_experiment_kipf.py` runs the main experiment with multiple graph NN architectures of type 2 (Table 2 in the paper). Evaluates the models produced by `100k_fit_experiment_kipf.py`.
- `100k_fit_experiment_feedforward.py` runs the reference experiment with a few feedforward architectures (Table 3 in the paper). Fits models.
- `100k_eval_experiment_feedforward.py` runs the reference experiment with a few feedforward architectures (Table 3 in the paper). Evaluates the models produced by `100k_fit_experiment_feedforward.py`.
- `random_topologies_experiment.py` runs random graph experiments for graph NNs of type 1 (method 1, Figure 1(a) in the paper).
- `permuted_topologies_experiment.py` runs random (permuted) graph experiments for graph NNs of type 1 (method 2, Figure 1(b) in the paper).
- `random_topologies_experiment_kipf.py` is an analogue of `random_topologies_experiment.py` for graph NNs of type 2 (see Supplementary materials for results).
- `permuted_topologies_experiment_kipf.py` is an analogue of `permuted_topologies_experiment.py` for graph NNs of type 2 (see Supplementary materials for results).
- `toy_random_topologies_experiment.py` is an analogue of `random_topologies_experiment.py` for our toy problem (see Supplementary materials for results).
- `toy_permuted_topologies_experiment.py` is an analogue of `permuted_topologies_experiment.py` for our toy problem (see Supplementary materials for results).
Notebooks:
- `graph_nn_type_1_tests.ipynb` documents a number of experiments related to type 1 graph NNs, including preliminary results, a random graph experiment summary, and some NN visualizations (not included in the paper).
- `graph_nn_type_2_tests.ipynb` is an analogue of `graph_nn_type_1_tests.ipynb` for type 2 networks.
- `feedforward_nn_tests.ipynb` contains first test results for feedforward neural networks, in preparation for Table 3 in the paper.
- `fit_and_evaluate_experiments_summary_type_1.ipynb` was used to produce Table 1 in the paper, as well as two tables included in the Supplementary materials.
- `fit_and_evaluate_experiments_summary_type_2.ipynb` is a full analogue of `fit_and_evaluate_experiments_summary_type_1.ipynb` for graph NNs of type 2. Table 2 was obtained with this code, as well as two further tables included in the Supplementary materials.
- `fit_and_evaluate_experiments_summary_feedforward.ipynb` is a full analogue of `fit_and_evaluate_experiments_summary_type_1.ipynb` for fully connected feedforward networks. Table 3 was obtained with this code, as well as two further tables included in the Supplementary materials.
Result files:
- `100k_fit_and_evaluate_experiments/fit_eval_results.csv` contains the basic results of the main experiment for type 1 graph NNs. These results are now part of Table 1 in the paper.
- `100k_fit_and_evaluate_experiments_kipf/fit_eval_results_kipf.csv` is an analogue of `100k_fit_and_evaluate_experiments/fit_eval_results.csv` for type 2 networks. These results are now part of Table 2 in the paper.
- `100k_feedforward_fit_and_evaluate_experiments/fit_eval_results.csv` is an analogue of `100k_fit_and_evaluate_experiments/fit_eval_results.csv` for fully connected feedforward NNs. These results are now part of Table 3 in the paper.
- `random_topologies_3_2.csv` contains results of random graph experiments for type 1 graph NNs (method 1, Figure 1(a) in the paper).
- `permuted_topologies_0.csv` contains results of random graph experiments for type 1 graph NNs (method 2, Figure 1(b) in the paper).
- `random_topologies_3_2_kipf.csv` is an analogue of `random_topologies_3_2.csv` for type 2 networks. Used to produce a plot in the Supplementary materials.
- `permuted_topologies_0_kipf.csv` is an analogue of `permuted_topologies_0.csv` for type 2 networks. Used to produce a plot in the Supplementary materials.
- `toy_random_topologies_0.csv` contains results of random graph experiments for type 1 graph NNs on our toy problem (graph randomization method 1). Used to produce a plot in the Supplementary materials.
- `toy_permuted_topologies_0.csv` contains results of random graph experiments for type 1 graph NNs on our toy problem (graph randomization method 2). Used to produce a plot in the Supplementary materials.
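The CSV result files can be inspected without any of the experiment code. A minimal standard-library sketch (only the path comes from this README; we make no assumptions about the column names):

```python
import csv
import os

def load_results(csv_path):
    """Read a results CSV into a list of row dicts keyed by the header."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))

path = "100k_fit_and_evaluate_experiments/fit_eval_results.csv"
if os.path.exists(path):
    rows = load_results(path)
    print(f"{len(rows)} result rows, columns: {list(rows[0].keys())}")
```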
Additional:
- `macierz_sasiedztwa.txt` contains the adjacency matrix of our traffic optimization problem ("macierz sąsiedztwa" is Polish for "adjacency matrix"). The matrix is directed and nonsymmetric, but we symmetrize it in all our experiments by extracting undirected edges.
- `environment.txt` lists the Python packages in the Python environment we used on AWS.
- PNG files contain the plots included in the paper and its Supplementary materials.
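"Extracting undirected edges" from a directed adjacency matrix can be read two ways; a small sketch of both symmetrizations (which reading matches the paper is our assumption, not confirmed by this README):

```python
import numpy as np

def symmetrize(adj, mode="union"):
    """Symmetrize a directed 0/1 adjacency matrix.

    mode="union":  keep an undirected edge if either direction exists.
    mode="mutual": keep it only if both directions exist.
    """
    adj = np.asarray(adj)
    if mode == "union":
        return np.maximum(adj, adj.T)
    return np.minimum(adj, adj.T)
```

Either variant can then be applied to the matrix loaded from `macierz_sasiedztwa.txt` before building a network.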
NOTE: We initially dubbed type 2 neural networks "Kipf's networks" due to a formal analogy to the networks discussed in (Kipf, T. N. and Welling, M. (2017)). The name stuck, as is still visible in several file names in the repository.
Here is a minimal example of constructing a graph neural network of type 1:
```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x API (tf.placeholder)

import graph_utils as graph_utils
import graph_neural_networks as graph_nn

# Load the (directed) adjacency matrix of the traffic network
adj_matrix = np.genfromtxt("macierz_sasiedztwa.txt")

# Placeholder for a batch of 21-dimensional input vectors
nn_input = tf.placeholder(dtype=tf.float32, shape=[None, 21])

# Build a type 1 graph neural network on top of the input
nn_output = graph_nn.transfer_matrix_neural_net(nn_input, 3, 4, tf.nn.tanh,
                                                adj_matrix, verbose=True)
```

References:

- van der Maaten, L. and Hinton, G. (2008) Visualizing Data using t-SNE, Journal of Machine Learning Research, vol. 9, pp. 2579--2605
- Kipf, T. N. and Welling, M. (2017) Semi-Supervised Classification with Graph Convolutional Networks, 5th International Conference on Learning Representations, ICLR 2017, Toulon, France, April 24-26, 2017, Conference Track Proceedings