The purpose of this tutorial series is to give you an introduction to AI, covering only the essentials any user needs.

A fairly correct definition from my point of view is the one presented on Wikipedia:

*Artificial intelligence (AI) is intelligence—perceiving, synthesizing, and inferring information—demonstrated by machines, as opposed to intelligence displayed by non-human animals and humans…*

Common data structures used in artificial intelligence are:

**Stack**: a one-dimensional data structure in which elements are added and removed from the same end, in last-in, first-out (LIFO) order.
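As a minimal sketch (not part of the original definitions), a plain Python list already behaves as a stack:

```python
# A Python list used as a stack: push with append(), pop from the same end
stack = []
stack.append(1)  # push 1
stack.append(2)  # push 2
stack.append(3)  # push 3

top = stack.pop()  # removes and returns the last element pushed (LIFO)
print(top)    # 3
print(stack)  # [1, 2]
```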

**Queue**: a one-dimensional data structure in which elements are added at one end and removed from the other, in first-in, first-out (FIFO) order.
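For illustration, Python's `collections.deque` gives efficient FIFO behavior (a sketch added here, not from the original text):

```python
from collections import deque

# A deque used as a FIFO queue: enqueue at the right, dequeue from the left
queue = deque()
queue.append("a")  # enqueue
queue.append("b")
queue.append("c")

first = queue.popleft()  # removes and returns the oldest element
print(first)        # a
print(list(queue))  # ['b', 'c']
```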

**Linked List**: a one-dimensional data structure with a variable size. Each element, or node, in the linked list contains a reference to the next element.
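A minimal singly linked list in Python might look like this (an illustrative sketch, with a hypothetical `Node` class):

```python
class Node:
    """A singly linked list node: a value plus a reference to the next node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

# Build the list 1 -> 2 -> 3 by hand
head = Node(1, Node(2, Node(3)))

# Walk the chain by following the next references
values = []
node = head
while node is not None:
    values.append(node.value)
    node = node.next
print(values)  # [1, 2, 3]
```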

**Tree**: a hierarchical data structure with a variable size where each node in the tree can have one or more child nodes, forming branches.
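A tree with an arbitrary number of children per node can be sketched like this (the `TreeNode` class and `count_nodes` helper are illustrative, not from the original):

```python
class TreeNode:
    """A tree node with a value and an arbitrary number of children."""
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

# A small tree: a root with two branches, one of which branches again
root = TreeNode("root", [
    TreeNode("left", [TreeNode("left.child")]),
    TreeNode("right"),
])

def count_nodes(node):
    """Recursively count this node plus all nodes in its subtrees."""
    return 1 + sum(count_nodes(child) for child in node.children)

print(count_nodes(root))  # 4
```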

**Graph**: a data structure with a variable size that consists of a set of vertices (or nodes) and edges connecting them.
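One common representation is an adjacency list; the sketch below pairs it with a breadth-first search, a traversal AI search algorithms build on (the graph and helper are illustrative):

```python
# A directed graph stored as an adjacency list:
# each vertex maps to the list of its neighbours
graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": [],
}

def reachable(graph, start):
    """Breadth-first search: return the set of vertices reachable from start."""
    seen = {start}
    frontier = [start]
    while frontier:
        vertex = frontier.pop(0)
        for neighbour in graph[vertex]:
            if neighbour not in seen:
                seen.add(neighbour)
                frontier.append(neighbour)
    return seen

print(sorted(reachable(graph, "A")))  # ['A', 'B', 'C', 'D']
```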

**Tensor**: a multi-dimensional data structure with a variable size, mostly used in deep learning and numerical computation.

These can be:

- **dense tensors**: the most common type of tensor, represented as a multi-dimensional array of numbers where each element in the array has a corresponding value.

- **sparse tensors**: tensors that contain mostly zeros, with only a small number of non-zero values.

- **complex tensors**: a generalization of tensors whose entries can be complex numbers.

- **hypercomplex tensors**: tensors whose entries are hypercomplex numbers. Hypercomplex numbers are not as widely used as complex numbers in AI, but they are used in certain specific applications such as computer vision, robotics, and physics.
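The dense/sparse distinction can be illustrated with a small NumPy example (the sparse representation below is a minimal coordinate-style sketch, not a production format):

```python
import numpy as np

# A dense 2D tensor: every element, including the zeros, is stored
dense = np.array([[0, 0, 3],
                  [4, 0, 0],
                  [0, 0, 5]])

# A sparse representation of the same data: only the non-zero entries
# are kept, stored here as a {(row, col): value} mapping
sparse = {(i, j): v for (i, j), v in np.ndenumerate(dense) if v != 0}

print(dense.size)   # 9 values stored densely
print(len(sparse))  # 3 values stored sparsely
```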

**Set**: an unordered collection of unique elements with a variable size.
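Python sets show the key properties directly (a small illustrative sketch):

```python
# A Python set: unordered, variable size, duplicates are discarded
labels = {"cat", "dog", "cat", "bird"}
print(len(labels))       # 3 -- the duplicate "cat" was dropped

labels.add("fish")       # grow
labels.discard("bird")   # shrink
print("dog" in labels)   # True -- fast membership test
```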

**Hash Table**: a data structure that uses a hash function to map keys to values, which can have a variable size depending on the number of elements stored in it.
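Python's built-in `dict` is a hash table, so the idea can be shown without any extra library (an illustrative sketch):

```python
# Python's dict is a hash table: the key's hash decides where the value lives
weights = {"bias": 0.1, "w1": 0.5}

weights["w2"] = -0.3    # insert; the table grows as needed
print(weights["w1"])    # 0.5 -- average O(1) lookup by key
print("w3" in weights)  # False -- fast membership test on keys
```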

**Heap**: a tree-based data structure with a variable size that keeps each parent ordered relative to its children (the heap property); it is typically implemented as a binary tree stored in an array.
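Python's standard `heapq` module maintains a min-heap inside a plain list (a minimal sketch):

```python
import heapq

# heapq keeps a min-heap inside a plain list: index 0 is always the minimum
heap = []
for value in [5, 1, 4, 2, 3]:
    heapq.heappush(heap, value)

smallest = heapq.heappop(heap)  # removes and returns the minimum
print(smallest)  # 1
print(heap[0])   # 2 -- the new minimum after the pop
```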

**Bloom Filter**: a fixed-size probabilistic data structure for testing whether an element is a member of a set; it can report false positives, but never false negatives.
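A minimal Bloom filter can be sketched with a bit array and a few salted hashes (the `BloomFilter` class below is illustrative, assuming SHA-256 as the hash; real implementations use faster non-cryptographic hashes):

```python
import hashlib

class BloomFilter:
    """A minimal Bloom filter: a fixed-size bit array plus k hash functions.

    Membership tests may return false positives, never false negatives.
    """

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive num_hashes bit positions from salted SHA-256 digests
        for salt in range(self.num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # True means "possibly present"; False means "definitely absent"
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("cat")
print(bf.might_contain("cat"))       # True
print(bf.might_contain("dinosaur"))  # almost certainly False
```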

**Examples:**

A video can be represented as a 4-dimensional tensor, where the first dimension represents the time axis (the frames), the second the height of each frame, the third the width, and the fourth the color channels.

```python
import torch

# 4D tensor representing a video: (frames, height, width, channels)
video_tensor = torch.randn(100, 256, 256, 3)
```

Hypercomplex numbers are defined as a + bi + cj + dk, where a, b, c, and d are real numbers and i, j, and k are the three imaginary units. Quaternions, the best-known example, can be used to represent rotations in 3D space.

```python
import numpy as np
import quaternion  # the numpy-quaternion package

# define a quaternion with real part a and imaginary parts bi, cj, dk
a, b, c, d = 1, 2, 3, 4
q = np.quaternion(a, b, c, d)

# define a quaternion from Euler angles
x, y, z = 1.0, 2.0, 3.0
q2 = quaternion.from_euler_angles(x, y, z)

# define a vector to rotate
v = [1, 0, 0]

# perform the rotation by conjugation: q2 * v * q2.conj()
# quaternion multiplication is not commutative, so the order matters;
# the vector is first embedded as a pure quaternion (real part 0)
rotated_v = (q2 * quaternion.quaternion(0, *v)) * q2.conj()
print(rotated_v)

# convert a quaternion back to Euler angles
x, y, z = quaternion.as_euler_angles(q)
```

The result of this source code is this: *quaternion(0, 0.103846565151668, 0.422918571742548, 0.900197629735517)*.