Commit 2a55698

Update README with links and quickstart (#67)
1 parent cfe8ecb commit 2a55698

File tree

1 file changed: +68 -0 lines changed


README.md

Lines changed: 68 additions & 0 deletions
@@ -11,6 +11,74 @@ This library is an extension of the core Keras API; all high-level modules
receive that same level of polish as core Keras. If you are familiar with Keras,
congratulations! You already understand most of Keras Recommenders.

## Quick Links

- [Home page](https://keras.io/keras_rs)
- [Examples](https://keras.io/keras_rs/examples)
- [API documentation](https://keras.io/keras_rs/api)

## Quickstart

### Train your own cross network

Choose a backend:

```python
import os
os.environ["KERAS_BACKEND"] = "jax"  # Or "tensorflow" or "torch"!
```

Import KerasRS and other libraries:

```python
import keras
import keras_rs
import numpy as np
```

Define a simple model which uses the `FeatureCross` layer and train it:

```python
vocabulary_size = 32
embedding_dim = 6

inputs = keras.Input(shape=(), name="indices", dtype="int32")
x0 = keras.layers.Embedding(
    input_dim=vocabulary_size,
    output_dim=embedding_dim,
)(inputs)
x1 = keras_rs.layers.FeatureCross()(x0, x0)
x2 = keras_rs.layers.FeatureCross()(x0, x1)
output = keras.layers.Dense(units=10)(x2)
model = keras.Model(inputs, output)

# Compile the model.
model.compile(
    loss=keras.losses.MeanSquaredError(),
    optimizer=keras.optimizers.Adam(learning_rate=3e-4),
)

# Call `model.fit()` on dummy data.
batch_size = 2
x = np.random.randint(0, vocabulary_size, size=(batch_size,))
y = np.random.random(size=(batch_size,))
model.fit(x, y=y)
```

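Once training has run, the same model can score new inputs directly. A minimal sketch (the `new_indices` variable and the `predict` call below are illustrative additions, not part of the original quickstart):

```python
# Score a few new indices with the trained model (illustrative only).
new_indices = np.random.randint(0, vocabulary_size, size=(4,))
scores = model.predict(new_indices)
print(scores.shape)  # (4, 10): one 10-dimensional output per input index.
```
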
### Use ranking losses and metrics

If your task is to rank items in a list, you can make use of the ranking losses
and metrics which KerasRS provides. Below, we use the pairwise hinge loss and
track the nDCG metric:

```python
model.compile(
    loss=keras_rs.losses.PairwiseHingeLoss(),
    metrics=[keras_rs.metrics.NDCG()],
    optimizer=keras.optimizers.Adam(learning_rate=3e-4),
)
```

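Fitting with this ranking setup needs per-list relevance labels that match the model's scores. The sketch below assumes the ranking loss and metric accept labels and scores of shape `(batch_size, list_size)` (here `list_size` is 10, the model's output width); the dummy data is illustrative and not from the original README:

```python
# Dummy ranking data (illustrative): binary relevance labels for the 10
# scores the model produces per example. The shapes are an assumption.
x_rank = np.random.randint(0, vocabulary_size, size=(batch_size,))
y_rank = np.random.randint(0, 2, size=(batch_size, 10)).astype("float32")
model.fit(x_rank, y=y_rank)
```
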
## Installation

Keras Recommenders is available on PyPI as `keras-rs`:
