In this tutorial we will create a sequential model by passing a list of layer instances to the constructor of the **Sequential** class:

```
from keras.models import Sequential
from keras.layers import Dense, Activation
model = Sequential([
    Dense(32, input_shape=(784,)),
    Activation('relu'),
    Dense(10),
    Activation('softmax'),
])
```

You can also simply add layers using the **add()** method.

```
model = Sequential()
model.add(Dense(32, input_dim=784))
model.add(Activation('relu'))
```
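Either way you build it, you can inspect the resulting architecture with the model's summary() method. A minimal sketch, assuming standalone Keras is installed:

```python
from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(32, input_dim=784))
model.add(Activation('relu'))

# Prints a layer-by-layer overview: layer types, output shapes, parameter counts
model.summary()
```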

**Specifying the dimension of the input data**

The model needs to know the shape of its input data. Only the first layer in a Sequential model requires this information, because every subsequent layer infers its input shape automatically from the previous layer. There are several ways to provide it:

- Pass the input_shape argument to the first layer.
- Some 2D layers, such as Dense, support specifying their input shape via the input_dim argument, and some 3D temporal layers support the input_dim and input_length arguments.

Thus, the following code fragments are strictly equivalent:

```
model.add(Dense(32, input_shape=(784,)))
model.add(Dense(32, input_dim=784))
```
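A quick way to convince yourself of this equivalence is to build a model each way and compare the input shapes Keras records for the first layer. A sketch, assuming standalone Keras:

```python
from keras.models import Sequential
from keras.layers import Dense

# Same layer declared with input_shape and with input_dim
m1 = Sequential([Dense(32, input_shape=(784,))])
m2 = Sequential([Dense(32, input_dim=784)])

# Both forms expect batches of 784-dimensional vectors
assert m1.layers[0].input_shape == m2.layers[0].input_shape == (None, 784)
```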

**Model compilation**

Before training the model, you must configure the learning process, which is done with the compile method. This method takes three arguments:

- **Optimizer**. This can be the string identifier of an existing optimizer (for example, rmsprop or adagrad) or an instance of the Optimizer class.
- **Loss function**. This is the objective that the model will try to minimize. It can be the string identifier of an existing loss function (for example, categorical_crossentropy or mse) or a custom objective function.
- **Metrics**. For any classification problem you will want to set metrics=['accuracy']. A metric can be the string identifier of an existing metric or a custom metric function.

Examples:

- For a multi-class classification problem:

```
model.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])
```

- For a binary classification problem:

```
model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['accuracy'])
```

- For a regression problem with mean squared error:

```
model.compile(optimizer='rmsprop', loss='mse')
```
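As noted above, instead of a string identifier you can pass an Optimizer instance, which lets you set hyperparameters such as the learning rate explicitly. A sketch (the learning rate value here is only an illustration; in older Keras versions the argument is named lr rather than learning_rate):

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import RMSprop

model = Sequential([Dense(10, activation='softmax', input_dim=784)])

# Passing an Optimizer instance allows explicit hyperparameters
model.compile(optimizer=RMSprop(learning_rate=0.001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```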

An example of creating and using your own metric:

```
import keras.backend as K

def mean_pred(y_true, y_pred):
    # Custom metric: the mean of the predicted values
    return K.mean(y_pred)

model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['accuracy', mean_pred])
```

**Training the model**

Keras models are trained on NumPy arrays of input data and target values. For this, the fit function is used. Consider an example of creating a model for binary classification.

```
import numpy as np

model = Sequential()
model.add(Dense(32, activation='relu', input_dim=100))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['accuracy'])
# Create dummy data
data = np.random.random((1000, 100))
labels = np.random.randint(2, size=(1000, 1))
# Train the model for 10 epochs with 32 examples per batch
model.fit(data, labels, epochs=10, batch_size=32)
```
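After training, the model can be scored and used for inference with evaluate() and predict(). A minimal continuation of the example above (a sketch, trained for one epoch on the same dummy data):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(32, activation='relu', input_dim=100))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='rmsprop', loss='binary_crossentropy', metrics=['accuracy'])

data = np.random.random((1000, 100))
labels = np.random.randint(2, size=(1000, 1))
model.fit(data, labels, epochs=1, batch_size=32, verbose=0)

# evaluate() returns the loss followed by the compiled metrics
loss, accuracy = model.evaluate(data, labels, batch_size=32, verbose=0)

# predict() returns one sigmoid probability per input sample
predictions = model.predict(data[:5])
print(predictions.shape)
```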