AlgoMorphism

A general framework for Object Oriented Programming (OOP) on the Back-Propagation optimization algorithm.


Description & Deep Learning Experiment

AlgoMorphism (AM) is a framework for Object Oriented Programming (OOP) on the Back-Propagation optimization algorithm. An optimization experiment is defined using objects; examples are the Dataset Object (DO), which serves as the input of the experiment, and the Model Object (MO), which is optimized.


First, call or develop a custom Dataset Object (DO); every DO has 1 to 3 elements:

Second, call or develop custom Cost Objects and a Score Object:

Third, call or develop a custom Model Object (MO); every MO has one required element:

An MO also has 2 to 3 elements:

An MO inherits the Base neural network Object (BO), with the status and the DO as required elements. Through BO, the MO becomes trainable.


Once training has finished, the experiment is complete. Plot the results per iteration (aka history) by calling the `multiple_models_history_figure` function, which plots the Cost, Metric, and Score per iteration (aka epoch).
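Conceptually, the loop that BO drives during training is ordinary gradient descent, recording a cost value per epoch. The sketch below is plain Python (not AlgoMorphism code) and only illustrates the kind of per-iteration history that is later plotted:

```python
# Minimal gradient-descent sketch (illustration only, not AM code): fit
# y = w * x to data generated with w_true = 3, recording the cost per
# epoch -- the same kind of history an AM experiment accumulates.

def train(xs, ys, lr=0.05, epochs=50):
    w = 0.0
    history = []  # cost per epoch, i.e. the "history" that gets plotted
    for _ in range(epochs):
        # cost: mean squared error over the dataset
        cost = sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
        # analytic gradient of the cost with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad  # gradient-descent update
        history.append(cost)
    return w, history

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0 * x for x in xs]  # data from w_true = 3
w, history = train(xs, ys)
```

The recovered `w` approaches 3 and the recorded cost decreases monotonically, which is the shape of curve `multiple_models_history_figure` visualizes.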


Learning Graphs Example

In this learning experiment we classify randomly generated graphs of different types and node counts using Graph Convolutional Networks. First we call the DO provided by the framework, then we illustrate the MO for graph classification and train the model. After training we plot the learning results.
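One plausible way such a dataset can mix graphs with different node counts (an assumption about `SimpleGraphsDataset`, whose internals are not shown here) is to represent each graph as an adjacency matrix zero-padded to `n_nodes_max`. The plain-Python sketch below builds one of the listed graph types, a cycle:

```python
# Hypothetical illustration (not AlgoMorphism code): the adjacency matrix
# of an n-node cycle graph, zero-padded to n_max x n_max so that graphs
# with different node counts share one fixed input size.

def cycle_adjacency(n, n_max):
    """Adjacency matrix of an n-node cycle, zero-padded to n_max x n_max."""
    a = [[0] * n_max for _ in range(n_max)]
    for i in range(n):
        j = (i + 1) % n          # next node around the cycle
        a[i][j] = a[j][i] = 1    # undirected edge i -- j
    return a

a = cycle_adjacency(4, 6)  # 4-node cycle padded to 6 x 6
```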



Configure and call the generated `SimpleGraphsDataset`

import algomorphism as am
import tensorflow as tf

examples = 500
n_nodes_min = 10
n_nodes_max = 20
graph_types = ['cycle', 'star', 'wheel', 'complete', 'lollipop',
                'hypercube', 'circular_ladder', 'grid']

n_c = len(graph_types)
g_dataset = am.datasets.generated_data.SimpleGraphsDataset(examples, n_nodes_min, n_nodes_max, graph_types)
a_train = g_dataset.get_train_data()
a_dim = a_train.shape[1]


Illustrate a custom `Graph Convolutional Classifier` MO

class GraphConvolutionalClassifier(tf.Module, am.base.BaseNeuralNetwork):
  def __init__(self):
    tf.Module.__init__(self, name='gcn_classifier')
    status = [
      [0],
      [1, 2]
    ]
    # ScoreMetricObject, CostMetricObject and CostLossObject are custom
    # Score/Cost objects; their definitions are omitted here
    self.score_mtr = am.base.MetricBase(self,
      [ScoreMetricObject()],
      status=status,
      mtr_select=[0]
    )

    self.cost_mtr = am.base.MetricBase(self,
      [CostMetricObject()],
      status=status,
      mtr_select=[0],
      status_out_type=1
    )
    
    self.cost_loss = am.base.LossBase(self,
      [CostLossObject()],
      status=status,
      loss_select=[0]
    )
    
    am.base.BaseNeuralNetwork.__init__(self, status, g_dataset)
    
    self.gcn1 = am.layers.GCN(a_dim, 32)
    self.gcn2 = am.layers.GCN(32, 64)
    
    self.flatten = tf.keras.layers.Flatten()
    self.out = am.layers.FC(a_dim*64, n_c, 'softmax')

  def __call__(self, inputs):
    x = self.gcn1(inputs[0], inputs[1])
    x = self.gcn2(x, inputs[1])
    x = self.flatten(x)
    
    y = self.out(x)
    return (y,)  # model outputs are returned as a tuple
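Each GCN layer propagates node features through the adjacency matrix; a common formulation is `activation(A @ X @ W)` (an assumption about `am.layers.GCN`, not its documented API). The plain-Python sketch below traces the tensor shapes through the classifier:

```python
# Shape-flow sketch of the classifier above, assuming each GCN layer
# computes A @ X @ W (a common graph-convolution form -- an assumption).
# Plain Python, no framework required; values are dummies, only shapes matter.

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

a_dim = 5                                   # padded number of nodes
A = [[0.0] * a_dim for _ in range(a_dim)]   # adjacency: (a_dim, a_dim)
X = [[1.0] * a_dim for _ in range(a_dim)]   # node features: (a_dim, a_dim)

W1 = [[0.0] * 32 for _ in range(a_dim)]     # gcn1 weights: (a_dim, 32)
h1 = matmul(matmul(A, X), W1)               # -> (a_dim, 32)

W2 = [[0.0] * 64 for _ in range(32)]        # gcn2 weights: (32, 64)
h2 = matmul(matmul(A, h1), W2)              # -> (a_dim, 64)

flat_len = len(h2) * len(h2[0])             # flatten -> a_dim * 64
```

Under this assumption, flattening yields a vector of length `a_dim * 64`, which matches the input size given to `am.layers.FC(a_dim*64, n_c, 'softmax')` in the classifier above.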



Train MO

gcn = GraphConvolutionalClassifier()
gcn.train(g_dataset, epochs=150, print_types=['train', 'val'])


Plot History

am.figures.nn.multiple_models_history_figure([gcn])
Efthymis on 2022-01-20