Today, I’ll redo something that one of my students (Emre Yuksel) did in his senior thesis: I’ll construct neural network models to detect regularity and bipartiteness in graphs. To be clear: I am running similar experiments without using Emre’s code to see if we get similar results.
A graph is called regular if every vertex has the same degree. On the other hand, a graph is called bipartite if one can color its vertices with two colors such that no two vertices of the same color are connected by an edge.
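Before building any models, these two properties are easy to check directly with networkx; here is a quick sketch using small cycle graphs as examples:

```python
import networkx as nx

# A 4-cycle: every vertex has degree 2, and alternating colors around the
# cycle give a proper 2-coloring, so it is both regular and bipartite.
C4 = nx.cycle_graph(4)
print(nx.is_regular(C4), nx.is_bipartite(C4))   # True True

# A 5-cycle is still 2-regular, but an odd cycle admits no 2-coloring.
C5 = nx.cycle_graph(5)
print(nx.is_regular(C5), nx.is_bipartite(C5))   # True False
```

The neural networks below will have to learn these exact checks from flattened adjacency matrices alone.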
Let us start by importing our libraries:
import tensorflow.keras as ks
import networkx as nx
import numpy as np
from random import random
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

I am going to need a function that returns a mixture of random graphs and graphs of a given type, together with their labels.
def randomGraphs(n, m, p, q, regular=True):
    X = []
    y = []
    for i in range(n):
        s = random()
        r = p + (q - p) * random()  # edge density drawn uniformly from [p, q]
        cls = 1
        if s < 0.5:
            # negative sample: an Erdos-Renyi random graph
            res = nx.fast_gnp_random_graph(m, r)
            cls = 0
        elif regular:
            # positive sample: a random regular graph of degree int(m*r)
            res = nx.random_regular_graph(int(m * r), m)
        else:
            # positive sample: a random bipartite graph with parts of size ell and m-ell
            ell = int(s * m)
            res = nx.bipartite.random_graph(ell, m - ell, r)
        X.append(np.array(nx.adjacency_matrix(res).toarray().flatten()))
        y.append(cls)
    return (np.array(X), np.array(y))

And here is the code for the experiment itself:
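Each sample is the flattened m×m adjacency matrix of a graph. A minimal sketch of that encoding, using a small cycle graph as a stand-in for the randomly generated ones:

```python
import networkx as nx
import numpy as np

# Encode a graph the same way randomGraphs does: flatten its adjacency matrix.
G = nx.cycle_graph(4)
x = np.array(nx.adjacency_matrix(G).toarray().flatten())
print(x.shape)       # (16,) -- an m*m feature vector for m = 4
print(int(x.sum()))  # 8: each of the 4 edges appears twice in the symmetric matrix
```

So for m = 50 below, every graph becomes a 2500-dimensional 0/1 vector.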
def Experiment(N, m, p, q, regular=True, epochs=10):
    X, y = randomGraphs(N, m, p, q, regular)
    x_train, x_test, y_train, y_test = train_test_split(X, y)
    model = ks.Sequential()
    model.add(ks.layers.Dense(m * m, activation='relu'))
    model.add(ks.layers.Dropout(0.5))
    model.add(ks.layers.Dense(m, activation='relu'))
    model.add(ks.layers.Dropout(0.25))
    model.add(ks.layers.Dense(1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    model.fit(x_train, y_train, epochs=epochs, batch_size=int(N / epochs))
    # predict_classes was removed from recent Keras; threshold the sigmoid output instead
    y_pred = (model.predict(x_test) > 0.5).astype(int)
    return accuracy_score(y_test, y_pred)

print(Experiment(2000, 50, 0.2, 0.3, True, 25))

Epoch 1/25
19/19 [==============================] - 1s 52ms/step - loss: 0.7299 - accuracy: 0.5740
...
Epoch 25/25
19/19 [==============================] - 1s 49ms/step - loss: 0.0354 - accuracy: 0.9900
0.908

print(Experiment(1200, 50, 0.2, 0.3, False, 10))

Epoch 1/10
8/8 [==============================] - 1s 50ms/step - loss: 0.6087 - accuracy: 0.7289
...
Epoch 10/10
8/8 [==============================] - 0s 51ms/step - loss: 0.0046 - accuracy: 1.0000
1.0

The first experiment yielded 91% accuracy on the test set, while the second yielded 100%. It looks like using neural network models to detect graph properties does work!
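Since the final layer is a single sigmoid unit, the model outputs a probability in [0, 1], and the class labels scored above come from thresholding at 0.5. A minimal sketch of that decision rule, on hypothetical raw outputs standing in for model.predict(x_test):

```python
import numpy as np

# Hypothetical sigmoid outputs of the trained model on three test graphs
probs = np.array([[0.93], [0.12], [0.51]])
labels = (probs > 0.5).astype(int).flatten()
print(labels)  # [1 0 1]
```

Anything strictly above 0.5 is classified as regular (or bipartite), everything else as a generic random graph.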