Today, I’ll redo something that one of my students (Emre Yuksel) did in his senior graduating thesis: I’ll construct neural network models to detect regularity and bipartite-ness in graphs. To be clear: I am doing similar experiments without using Emre’s code to see if we get similar results.
A graph is called regular if every vertex has the same degree, while a graph is called bipartite if one can color its vertices with two colors such that no two vertices of the same color are connected by an edge.
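As a quick sanity check on these definitions, here is a minimal sketch using networkx (an even cycle is both 2-regular and bipartite; the specific graph is just an illustration):

```python
import networkx as nx

# An even cycle is 2-regular and bipartite.
G = nx.cycle_graph(6)
degrees = {d for _, d in G.degree()}
print(len(degrees) == 1)   # every vertex has the same degree (2)
print(nx.is_bipartite(G))  # even cycles are 2-colorable
```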
Let us start by importing our libraries:
import tensorflow.keras as ks
import networkx as nx
import numpy as np
from random import random
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
I am going to need a function that returns a mixture of purely random graphs and graphs of a certain type (regular or bipartite).
def randomGraphs(n,m,p,q,regular=True):
    X = []
    y = []
    for i in range(n):
        s = random()
        r = p + (q-p)*random()
        cls = 1
        if s<0.5:
            # purely random graph with edge probability r
            res = nx.random_graphs.fast_gnp_random_graph(m,r)
            cls = 0
        elif regular:
            # random regular graph of degree int(m*r) on m vertices
            res = nx.random_graphs.random_regular_graph(int(m*r),m)
        else:
            # random bipartite graph with parts of size ell and m-ell
            ell = int(s*m)
            res = nx.bipartite.random_graph(ell,m-ell,r)
        # flatten the adjacency matrix into a fixed-length feature vector
        X.append(np.array(nx.adjacency_matrix(res).toarray().flatten()))
        y.append(cls)
    return (np.array(X),np.array(y))
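To make concrete how each graph becomes a training example, here is a small standalone sketch (the sizes are illustrative, not the ones used in the experiments below): a graph on m vertices is encoded as the m*m entries of its flattened adjacency matrix.

```python
import networkx as nx
import numpy as np

m = 6
G = nx.random_regular_graph(3, m)  # a 3-regular graph on 6 vertices
x = np.array(nx.adjacency_matrix(G).toarray()).flatten()
print(x.shape)  # one feature vector of length m*m = 36
```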
And here is the code for the experiment itself:
def Experiment(N,m,p,q,regular=True,epochs=10):
    X,y = randomGraphs(N,m,p,q,regular)
    x_train, x_test, y_train, y_test = train_test_split(X,y)
    model = ks.Sequential()
    model.add(ks.layers.Dense(m*m,activation='relu'))
    model.add(ks.layers.Dropout(0.5))
    model.add(ks.layers.Dense(m,activation='relu'))
    model.add(ks.layers.Dropout(0.25))
    model.add(ks.layers.Dense(1,activation='sigmoid'))
    model.compile(optimizer='adam',loss='binary_crossentropy',metrics=['accuracy'])
    model.fit(x_train, y_train, epochs=epochs, batch_size=int(N/epochs))
    # predict_classes was removed in newer Keras; threshold the sigmoid output instead
    y_pred = (model.predict(x_test) > 0.5).astype("int32").flatten()
    return accuracy_score(y_pred,y_test)
print(Experiment(2000,50,0.2,0.3,True,25))
Epoch 1/25
19/19 [==============================] - 1s 52ms/step - loss: 0.7299 - accuracy: 0.5740
...
Epoch 25/25
19/19 [==============================] - 1s 49ms/step - loss: 0.0354 - accuracy: 0.9900
0.908
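For context on that final number: train_test_split holds out 25% of the samples by default, so with N=2000 the reported accuracy is computed over 500 test graphs. A sketch with dummy data:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# With the default test_size=0.25, 2000 samples split into 1500 train / 500 test.
X = np.zeros((2000, 4))
y = np.zeros(2000)
x_train, x_test, y_train, y_test = train_test_split(X, y)
print(len(x_train), len(x_test))  # 1500 500
```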
print(Experiment(1200,50,0.2,0.3,False,10))
Epoch 1/10
8/8 [==============================] - 1s 50ms/step - loss: 0.6087 - accuracy: 0.7289
...
Epoch 10/10
8/8 [==============================] - 0s 51ms/step - loss: 0.0046 - accuracy: 1.0000
1.0
The first experiment (detecting regularity) yielded about 91% test accuracy, while the second (detecting bipartite-ness) yielded 100%. It looks like using neural network models to detect graph properties does work!