Lab expts sem 6

Lab expts sem 6

AIML

TW1 - DFID

```python
from collections import defaultdict

class Graph:
    def __init__(self, vertices):
        self.V = vertices
        self.graph = defaultdict(list)

    def addEdge(self, u, v):
        self.graph[u].append(v)

    def DLS(self, src, target, maxDepth, visited):
        visited.append(src)  # Add the current node to visited
        if src == target:
            return True, visited
        if maxDepth <= 0:
            return False, visited
        for i in self.graph[src]:
            found, visited = self.DLS(i, target, maxDepth - 1, visited)
            if found:
                return True, visited
        return False, visited

    def IDDFS(self, src, target, maxDepth):
        visited = []  # Initialize the visited list
        for i in range(maxDepth + 1):  # Include maxDepth level
            found, visited = self.DLS(src, target, i, visited)
            if found:
                return True, visited
        return False, visited

# Create a graph
g = Graph(7)
g.addEdge(0, 1)
g.addEdge(0, 2)
g.addEdge(1, 3)
g.addEdge(1, 4)
g.addEdge(2, 5)
g.addEdge(2, 6)

target = 6
maxDepth = 3
src = 0

found, visited = g.IDDFS(src, target, maxDepth)
if found:
    print("Target is reachable from source within max depth")
    print("Visited nodes:", visited)
else:
    print("Target is NOT reachable from source within max depth")
    print("Visited nodes:", visited)
```

TW2 - BFS

```python
# Python3 program to print BFS traversal
# from a given source vertex. BFS(s)
# traverses vertices reachable from s.
from collections import defaultdict

# This class represents a directed graph
# using adjacency list representation
class Graph:

    # Constructor
    def __init__(self):
        # Default dictionary to store graph
        self.graph = defaultdict(list)

    # Function to add an edge to graph
    def addEdge(self, u, v):
        self.graph[u].append(v)

    # Function to print a BFS of graph
    def BFS(self, s):
        # Mark all the vertices as not visited
        visited = [False] * (max(self.graph) + 1)

        # Create a queue for BFS
        queue = []

        # Mark the source node as
        # visited and enqueue it
        queue.append(s)
        visited[s] = True

        while queue:
            # Dequeue a vertex from
            # queue and print it
            s = queue.pop(0)
            print(s, end=" ")

            # Get all adjacent vertices of the
            # dequeued vertex s.
            # If an adjacent has not been visited,
            # then mark it visited and enqueue it
            for i in self.graph[s]:
                if not visited[i]:
                    queue.append(i)
                    visited[i] = True

# Driver code
if __name__ == '__main__':
    # Create a graph given in
    # the above diagram
    g = Graph()
    g.addEdge(0, 1)
    g.addEdge(0, 2)
    g.addEdge(1, 2)
    g.addEdge(2, 0)
    g.addEdge(2, 3)
    g.addEdge(3, 3)

    print("Following is Breadth First Traversal"
          " (starting from vertex 2)")
    g.BFS(2)

# This code is contributed by Neelam Yadav
# This code is modified by Susobhan Akhuli
```

TW3 - A* algorithm

```python
import heapq  # tw3

def astar(graph, start, goal):
    # priority queue to store nodes to be explored
    open_list = [(0, start)]
    # dictionary to store parent nodes
    parents = {}
    # dictionary to store g values (cost from start node to current node)
    g_values = {node: float('inf') for node in graph}
    g_values[start] = 0
    # dictionary to store f values (estimated total cost from start to goal)
    f_values = {node: float('inf') for node in graph}
    f_values[start] = graph[start][1]

    iteration = 0
    while open_list:
        # get node with minimum f value
        current_f, current_node = heapq.heappop(open_list)

        # check if current node is the goal
        if current_node == goal:
            path = []
            while current_node in parents:
                path.append(current_node)
                current_node = parents[current_node]
            path.append(start)
            final_cost = g_values[goal]
            print(f"\nFinal Cost: {final_cost}")
            return path[::-1]

        # explore neighbors
        for child, cost in graph[current_node][0].items():
            # calculate tentative g value
            tentative_g = g_values[current_node] + cost
            if tentative_g < g_values[child]:
                # update parent and g values
                parents[child] = current_node
                g_values[child] = tentative_g
                f_values[child] = tentative_g + graph[child][1]
                # add child to open list
                heapq.heappush(open_list, (f_values[child], child))

        iteration += 1
        print(f"\nIteration {iteration}:")
        print("Current Path:", reconstruct_path(parents, start, current_node))
        print(f"Evaluation Function Value for {current_node}: {f_values[current_node]}")

# Function to reconstruct the path from start to goal using parent nodes
def reconstruct_path(parents, start, goal):
    path = [goal]
    while goal != start:
        goal = parents[goal]
        path.append(goal)
    return path[::-1]

# Example usage:
start_node = 'A'
goal_node = 'G'
# Each node maps to [dict of neighbour: edge cost, heuristic value]
graph = {
    'A': [{'B': 5, 'C': 10}, 10],
    'B': [{'D': 5, 'E': 5}, 7],
    'C': [{'F': 5}, 7],
    'D': [{'G': 10}, 3],
    'E': [{'G': 7}, 2],
    'F': [{'G': 8}, 1],
    'G': [{}, 0]
}

print("\nA* Search Path:")
path = astar(graph, start_node, goal_node)
print("Final Path:", path)
```

USP

Running a code

We shall use Fedora to run most of the code. The code will be written in ANSI C. To run the code, we shall use the following commands: ...

March 23, 2024 · 57 min · 12031 words · Aum Pauskar, Shriram Naik

Image analysis theory

Image processing

Theory jargon

Difference between supervised and unsupervised learning

| Criteria | Supervised Learning | Unsupervised Learning |
| --- | --- | --- |
| Data | Uses labeled data for training. | Uses unlabeled data for training. |
| Goal | Predict a label for new data based on past observations. | Discover hidden patterns or intrinsic structures within the data. |
| Examples | Classification, Regression | Clustering, Association |
| Complexity | Less complex, as it has a clear goal. | More complex due to the lack of a clear goal. |
| Usage | When the outcome of the problem is known. | When the outcome of the problem is unknown. |
| Application #1 | Spam Detection | Customer Segmentation |
| Application #2 | Credit Fraud Detection | Anomaly Detection |

EM spectrum

The electromagnetic spectrum (EM spectrum) is the range of all types of EM radiation. Radiation is energy that travels and spreads out as it goes – visible light that comes from a lamp in your house or radio waves from a radio station are two types of electromagnetic radiation. Other examples of EM radiation are microwaves, infrared and ultraviolet light, X-rays, and gamma-rays. ...
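To make the supervised vs. unsupervised distinction in the table above concrete, here is a minimal sketch. It is not from the original post and assumes scikit-learn is available: a logistic-regression classifier is trained on labelled points (supervised), while k-means groups the same points without ever seeing the labels (unsupervised).

```python
# Hypothetical illustration (not from the post): supervised vs. unsupervised
# learning on the same toy data, assuming scikit-learn is installed.
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

# Toy dataset: 300 two-dimensional points drawn from three clusters.
X, y = make_blobs(n_samples=300, centers=3, random_state=42)

# Supervised: labels y are provided, and the goal is to predict them.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
clf = LogisticRegression().fit(X_train, y_train)
print("Supervised test accuracy:", clf.score(X_test, y_test))

# Unsupervised: only X is given; k-means discovers the groups on its own.
km = KMeans(n_clusters=3, n_init=10, random_state=42).fit(X)
print("Unsupervised cluster sizes:", [int((km.labels_ == k).sum()) for k in range(3)])
```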

December 6, 2023 · 6 min · 1102 words · Aum Pauskar

Image analysis with pytorch

Image Analysis using pytorch

Prerequisites

This project is built using Python in Ubuntu (WSL) and you'll need to install the following:

Any bash terminal (one of the following): Conda, WSL, Mac OS, or any flavour of Linux.

Python 3 (I'm using 3.10.12)

```bash
sudo apt update
sudo apt install python3
```

Pip

```bash
sudo apt update
sudo apt install python3-pip
```

Packages

Note: Since I'm using a computer with a CUDA-compatible NVIDIA GPU, I'll be using the GPU version of pytorch. If you don't have a GPU, you can install the CPU version of pytorch given below.

CPU install

```bash
pip3 install torch torchvision numpy matplotlib
```

GPU install

Installing numpy and matplotlib

```bash
pip3 install numpy matplotlib
```

Installing pytorch

Check the pytorch website to see which library is compatible with your system. In my case I'm using CUDA 11.8.

```bash
pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
```

Jupyter notebook (optional)

```bash
pip3 install jupyterlab
```

Or just use Jupyter notebook from VS Code.

Environment I've used

- CPU: Ryzen 7 5800H
- GPU: RTX 3060 Laptop
- RAM: 2x8GB DDR4 @ 3200MHz
- OS: Windows 11 / Ubuntu 22.04 (WSL)
- CUDA: 11.8
- Python: 3.10.12

MNIST number dataset

The MNIST dataset is a dataset of handwritten digits. It has 60,000 training images and 10,000 test images. We'll see code to load the dataset and display the occurrences of individual digits in the dataset. Or, if you want to run the code from a Jupyter notebook, you can clone my repository via git. ...
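The loading code itself is truncated in this summary. As a minimal sketch of what it could look like, assuming the torchvision `datasets.MNIST` class and matplotlib from the packages installed above (this is an illustration, not the post's exact code):

```python
# Hypothetical sketch (the post's own code is truncated here): load MNIST with
# torchvision and show how often each digit occurs in the training split.
import torch
import matplotlib.pyplot as plt
from torchvision import datasets, transforms

# Downloads the 60,000-image training set to ./data on first run.
train_set = datasets.MNIST(root="./data", train=True, download=True,
                           transform=transforms.ToTensor())

# train_set.targets holds the digit label of every image; count digits 0-9.
counts = torch.bincount(train_set.targets, minlength=10)
print("Occurrences per digit:", counts.tolist())

# Simple bar chart of the label distribution.
plt.bar(range(10), counts.tolist())
plt.xlabel("Digit")
plt.ylabel("Occurrences")
plt.title("MNIST training-set label distribution")
plt.show()
```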

December 3, 2023 · 5 min · 1062 words · Aum Pauskar