1. When each of the cities is connected to every other city by a road, a complete ___ is obtained.
Ans. Graph
2. The worst-case efficiency of the brute force algorithm is ___.
Ans. Θ(n²)
3. Searching requires ___ comparisons in the worst case when the array is sorted first.
Ans. ⌊log₂ n⌋ + 1
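A minimal binary search sketch on a sorted list, illustrating why at most ⌊log₂ n⌋ + 1 comparisons are needed: every comparison halves the remaining search range. The function name and sample data are illustrative.

```python
def binary_search(a, key):
    """Return the index of key in the sorted list a, or -1 if it is absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:                      # each iteration halves the search range
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        elif a[mid] < key:
            lo = mid + 1                 # discard the left half
        else:
            hi = mid - 1                 # discard the right half
    return -1

print(binary_search([3, 14, 27, 31, 39, 42, 55, 70, 74, 81, 85, 93, 98], 70))  # 7
```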
4. Gaussian elimination is an algorithm for solving systems of ___ equations.
Ans. Linear
5. In Gaussian elimination, we make the ___ part of the coefficient matrix zero.
Ans. Lower triangular
6. Gaussian elimination can be used to find the ___ of a matrix.
Ans. Rank
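A minimal Gaussian elimination sketch, assuming the pivots encountered are non-zero (no pivoting, and rank handling is omitted): forward elimination zeroes the lower-triangular part of the coefficient matrix, and back substitution then solves the resulting upper-triangular system. The sample system is illustrative.

```python
def gaussian_elimination(A, b):
    """Solve A x = b for a square system, assuming non-zero pivots."""
    n = len(A)
    # Forward elimination: zero out the entries below the main diagonal.
    for i in range(n):
        for j in range(i + 1, n):
            factor = A[j][i] / A[i][i]
            for k in range(i, n):
                A[j][k] -= factor * A[i][k]
            b[j] -= factor * b[i]
    # Back substitution on the upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][k] * x[k] for k in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# 2x + y = 3, x + 3y = 5  ->  x = 0.8, y = 1.4
print(gaussian_elimination([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))
```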
7. An AVL tree is a ___ tree.
Ans. Binary search
8. The ___ is the mirror image of the RL-rotation.
Ans. LR-rotation
9. The two kinds of nodes in a 2-3 tree are ___ and ___.
Ans. 2-node, 3-node
10. The ___ heap construction algorithm is less efficient than its counterpart.
Ans. Top-down
11. A heap can be defined as a ___ with keys assigned to its nodes.
Ans. Binary tree
12. The time efficiency of heapsort is ___ in both worst and average cases.
Ans. O(n log n)
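A minimal heapsort sketch: the max-heap is built bottom-up (the more efficient construction), then the maximum is repeatedly swapped to the end of the array and the heap is repaired. The whole sort runs in O(n log n) in both the worst and average cases.

```python
def sift_down(a, i, n):
    """Restore the max-heap property for the subtree rooted at index i (heap size n)."""
    while 2 * i + 1 < n:
        child = 2 * i + 1
        if child + 1 < n and a[child + 1] > a[child]:
            child += 1                           # pick the larger child
        if a[i] >= a[child]:
            break
        a[i], a[child] = a[child], a[i]
        i = child

def heapsort(a):
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):          # bottom-up heap construction
        sift_down(a, i, n)
    for end in range(n - 1, 0, -1):              # move the current maximum to its final place
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end)
    return a

print(heapsort([9, 2, 7, 1, 8, 3]))  # [1, 2, 3, 7, 8, 9]
```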
13. Greatest common divisor can be computed easily by ___.
Ans. Euclid’s algorithm
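A minimal sketch of Euclid's algorithm, which repeatedly replaces the pair (m, n) by (n, m mod n) until the remainder becomes zero.

```python
def gcd(m, n):
    """Euclid's algorithm: gcd(m, n) = gcd(n, m mod n)."""
    while n != 0:
        m, n = n, m % n
    return m

print(gcd(60, 24))  # 12
```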
14. The problem of counting a graph’s paths can be solved by computing an appropriate power of its ___.
Ans. Adjacency matrix
16. Input enhancement is based on ___ the instance.
Ans. Preprocessing
17. The cities are represented by ___ and the roads between them are shown as edges.
Ans. Vertices
18. Collision occurs when a hash function maps two or more keys to the ___.
Ans. Same hash value
19. When the interval between probes is computed by another hash function, it is called ___.
Ans. Double hashing
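A minimal open-addressing sketch in which a second hash function determines the interval between probes (double hashing). The table size and both hash functions are illustrative choices.

```python
def insert(table, key):
    """Place an integer key in an open-addressing table using double hashing."""
    m = len(table)                       # table size (a prime, here 11)
    h1 = key % m                         # primary hash: home slot
    h2 = 1 + key % (m - 1)               # secondary hash: probe interval, never zero
    for i in range(m):
        slot = (h1 + i * h2) % m
        if table[slot] is None:
            table[slot] = key
            return slot
    raise RuntimeError("hash table is full")

table = [None] * 11
for key in (14, 25, 36):                 # all three hash to slot 3, so two collisions occur
    insert(table, key)
print(table)                             # 14 stays in slot 3; 25 and 36 are probed to slots 9 and 10
```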
20. As the ___ increases, the height of the tree decreases, thus speeding access.
Ans. Branching factor
21. Access time increases slowly as the number of records ___.
Ans. Increases
22. The insertions in a B-Tree start from a ___.
Ans. Leaf node
23. Sorting is an example of input enhancement that achieves ___.
Ans. Time efficiency
24. The idea of input enhancement is to ___ the input pattern.
Ans. Preprocess
25. In Horspool’s algorithm, the characters from the pattern are matched ___.
Ans. Right to left
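A minimal sketch of Horspool's algorithm: a shift table precomputed from the pattern (input enhancement) tells how far to slide the pattern after a mismatch, and characters are compared right to left.

```python
def horspool(pattern, text):
    """Return the index of the first occurrence of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    # Shift table: for each pattern character (except the last), the distance
    # from its rightmost occurrence to the end of the pattern.
    shift = {pattern[i]: m - 1 - i for i in range(m - 1)}
    i = m - 1                            # text position aligned with the pattern's last character
    while i < n:
        k = 0
        while k < m and pattern[m - 1 - k] == text[i - k]:   # compare right to left
            k += 1
        if k == m:
            return i - m + 1             # full match found
        i += shift.get(text[i], m)       # slide the pattern; unseen characters shift by m
    return -1

print(horspool("BARBER", "JIM_SAW_ME_IN_A_BARBERSHOP"))  # 16
```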
26. The two heuristics in the Boyer-Moore algorithm are ___ and ___.
Ans. Good-suffix shift and bad-character shift
27. Each slot of a hash table is often called a ___.
Ans. Bucket
28. The information used to place the elements at their proper positions is the accumulated sum of frequencies, which is called a ___.
Ans. Distribution
29. The ___ and ___ are the two approaches to dynamic programming.
Ans. Top-down, bottom-up
30. ___ is a technique to store answers to sub-problems in a table.
Ans. Memoization
31. The ___ technique is similar to the dynamic programming approach.
Ans. Divide and conquer
32. ___, a mathematician, invented the Principle of Optimality.
Ans. Richard Ernest Bellman
33. All optimization problems tend to minimize cost and time and maximize ___.
Ans. Profits
34. ___ are node-based data structures used in many system programming applications for managing dynamic sets.
Ans. Binary search trees
35. The insertion, deletion and search operations of a binary search tree have an average-case complexity of ___.
Ans. O(log n)
36. The time taken to perform operations on a binary search tree is directly proportional to the ___ of the tree.
Ans. Height
37. The ___ expresses the problem using its sub-instances.
Ans. Recurrence relation
38. ___ is an NP-hard optimization problem.
Ans. Bounded Knapsack problem
39. The Knapsack problem keeps the total ___ within the knapsack’s capacity while maximizing the total value.
Ans. Weight
40. The goal of using ___ is to solve only the subproblems which are necessary.
Ans. Memory functions
41. Memory functions use a dynamic programming technique called ___ in order to reduce the inefficiency of recursion that might occur.
Ans. Memoization
42. The memory functions method solves the problem using the ___ approach.
Ans. Top-down
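A minimal memory-function sketch for the 0/1 knapsack problem: the recursion proceeds top-down, but each subproblem answer is stored in a table so it is computed at most once, and only the subproblems that are actually needed get solved. The item data in the example are illustrative.

```python
def knapsack(weights, values, capacity):
    """Memory-function 0/1 knapsack: top-down recursion plus a table of stored answers."""
    memo = {}

    def best(i, cap):
        # best(i, cap) = maximum value achievable with the first i items and capacity cap
        if i == 0 or cap == 0:
            return 0
        if (i, cap) not in memo:                 # solve each subproblem only once
            without = best(i - 1, cap)
            if weights[i - 1] > cap:
                memo[(i, cap)] = without
            else:
                memo[(i, cap)] = max(without,
                                     values[i - 1] + best(i - 1, cap - weights[i - 1]))
        return memo[(i, cap)]

    return best(len(weights), capacity)

print(knapsack([2, 1, 3, 2], [12, 10, 20, 15], 5))  # 37
```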
43. To carry out an insertion sort, begin at the ___-most element of the array.
Ans. Left
44. The asymptotic running time when we first calculate the nth Fibonacci number is ___.
Ans. O(n)
45. To compute the nth Fibonacci number, we follow the ___ dynamic programming approach.
Ans. Bottom-up
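A minimal bottom-up sketch for the nth Fibonacci number: the smallest subproblems are solved first and reused, so the whole computation takes O(n) time.

```python
def fib(n):
    """Compute F(n) bottom-up, reusing the two most recent subproblem results."""
    if n < 2:
        return n
    prev, curr = 0, 1                    # F(0), F(1)
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr

print(fib(10))  # 55
```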
46. DFS uses the ___ technique.
Ans. Backtracking
47. Theoretically, the travelling salesman problem, a ___ optimization problem, can always be solved by generating all Hamiltonian circuits.
Ans. Combinatorial
48. Binomial coefficients are studied in ___.
Ans. Combinatorics
49. Both Warshall’s and Floyd’s algorithms have the run time as ___.
Ans. Θ(n³)
50. Warshall’s algorithm is used to solve ___ problems.
Ans. Transitive closure
51. Floyd’s algorithm is used to solve ___ problems.
Ans. Shortest path
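A minimal sketch of Floyd's algorithm for all-pairs shortest paths; Warshall's transitive-closure algorithm has the same triple-loop Θ(n³) structure, with Boolean or/and in place of min/+. The weight matrix in the example is illustrative.

```python
INF = float("inf")

def floyd(weights):
    """All-pairs shortest paths; weights is an n x n matrix with INF where there is no edge."""
    n = len(weights)
    d = [row[:] for row in weights]              # work on a copy
    for k in range(n):                           # allow vertex k as an intermediate vertex
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

w = [[0, 3, INF, 7],
     [8, 0, 2, INF],
     [5, INF, 0, 1],
     [2, INF, INF, 0]]
print(floyd(w))  # [[0, 3, 5, 6], [5, 0, 2, 3], [3, 6, 0, 1], [2, 5, 7, 0]]
```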
52. The ___ is greedy in the sense that at each iteration it approximates the residual as well as possible by a single function.
Ans. Pure greedy algorithm
53. A greedy strategy usually progresses in a ___ fashion.
Ans. Top-down
54. The ___ is obtained by repeatedly selecting vertices adjacent to the already selected vertices.
Ans. Minimum spanning tree
55. Each ___ generated by Prim’s algorithm is a part of some minimum spanning tree.
Ans. Sub-tree
56. The strategy in Prim’s algorithm is greedy since, at each step, the tree is extended with an edge that contributes the ___ amount possible to the tree’s weight.
Ans. Minimum
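A minimal sketch of Prim's algorithm using a priority queue of candidate edges: at each step the cheapest edge joining an already selected vertex to an unselected one is added, so the tree's weight grows by the minimum amount possible. The adjacency-list representation and the sample graph are illustrative.

```python
import heapq

def prim(graph, start):
    """Prim's MST; graph maps each vertex to a list of (neighbour, weight) pairs."""
    visited = {start}
    mst = []
    heap = [(w, start, v) for v, w in graph[start]]    # edges leaving the current tree
    heapq.heapify(heap)
    while heap and len(visited) < len(graph):
        w, u, v = heapq.heappop(heap)            # cheapest edge leaving the tree
        if v in visited:
            continue
        visited.add(v)
        mst.append((u, v, w))
        for nxt, nw in graph[v]:
            if nxt not in visited:
                heapq.heappush(heap, (nw, v, nxt))
    return mst

g = {"a": [("b", 3), ("e", 6), ("f", 5)],
     "b": [("a", 3), ("c", 1), ("f", 4)],
     "c": [("b", 1), ("d", 6), ("f", 4)],
     "d": [("c", 6), ("e", 8), ("f", 5)],
     "e": [("a", 6), ("d", 8), ("f", 2)],
     "f": [("a", 5), ("b", 4), ("c", 4), ("d", 5), ("e", 2)]}
print(prim(g, "a"))  # a spanning tree of total weight 15
```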
57. In Kruskal’s algorithm if the graph is not connected, then the algorithm yields a ___.
Ans. Minimum spanning forest
58. The Union-Find data structure is helpful for managing ___ which is vital for Kruskal’s algorithm.
Ans. Equivalence classes
59. Correctness of Kruskal’s algorithm can be proved by saying that the constructed spanning tree is of ___.
Ans. Minimal weight
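A minimal sketch of Kruskal's algorithm with a simple Union-Find structure managing the equivalence classes (connected components): edges are examined in nondecreasing weight order and accepted only when they join two different components, so a disconnected graph yields a minimum spanning forest. The edge list is illustrative.

```python
def kruskal(n, edges):
    """Kruskal's algorithm; n vertices labelled 0..n-1, edges given as (weight, u, v)."""
    parent = list(range(n))

    def find(x):                                 # representative of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]        # path halving
            x = parent[x]
        return x

    forest = []
    for w, u, v in sorted(edges):                # nondecreasing order of edge weight
        ru, rv = find(u), find(v)
        if ru != rv:                             # accept only edges joining two components
            parent[ru] = rv                      # union of the two components
            forest.append((u, v, w))
    return forest

edges = [(3, 0, 1), (1, 1, 2), (6, 2, 3), (8, 3, 4), (6, 0, 4),
         (5, 0, 5), (4, 1, 5), (4, 2, 5), (5, 3, 5), (2, 4, 5)]
print(kruskal(6, edges))  # five edges of total weight 15
```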
60. Dijkstra’s algorithm solves the single-source ___ problem for a graph with non-negative edge weights.
Ans. Shortest path
61. The algorithm finds the path with the lowest cost between the ___ vertex and every other vertex.
Ans. Originating
62. The time complexity of Dijkstra’s algorithm can be improved for ___ graphs.
Ans. Sparse
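A minimal sketch of Dijkstra's algorithm with a binary heap, the usual refinement that improves its running time on sparse graphs to roughly O((n + m) log n); the adjacency-dict representation and sample graph are illustrative.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths from source; graph maps u -> list of (v, weight)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):        # stale entry: u already has a shorter path
            continue
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):   # found a cheaper path to v
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 3), ("d", 7)],
     "b": [("a", 3), ("c", 4), ("d", 2)],
     "c": [("b", 4), ("d", 5), ("e", 6)],
     "d": [("a", 7), ("b", 2), ("c", 5), ("e", 4)],
     "e": [("c", 6), ("d", 4)]}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 3, 'd': 5, 'c': 7, 'e': 9}
```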
63. Huffman codes are digital ___ codes.
Ans. Data compression
64. The Huffman Encoding scheme falls in the category of ___.
Ans. Variable-length encoding
65. Static Huffman coding is done with the help of ___ tables.
Ans. Statistical symbol frequency
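A minimal sketch of static Huffman coding driven by a symbol-frequency table: the two lightest subtrees are repeatedly merged, producing variable-length, prefix-free codewords that are shorter for the more frequent symbols. The frequency table is illustrative.

```python
import heapq

def huffman_codes(freq):
    """Build a Huffman code from a dict of symbol frequencies."""
    # Each heap entry: (weight, unique tie-breaker, {symbol: code_so_far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)          # the two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

# The more frequent symbols (A, C, D) receive 2-bit codes; B and _ receive 3-bit codes.
print(huffman_codes({"A": 0.35, "B": 0.1, "C": 0.2, "D": 0.2, "_": 0.15}))
```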
66. Principle of ___ is defined as a basic dynamic programming principle that helps us to view problems as a sequence of subproblems.
Ans. Optimality
67. The choices made in a greedy algorithm cannot depend on ___ choices.
Ans. Future
68. Finding a ___ means calculating the minimum amount of work required to solve the problem.
Ans. Lower bound
69. A trivial lower bound is obtained by counting the input data that the algorithm reads and the output it produces. (True/False)
Ans. True
70. ___ method defines the lower bound of an algorithm based on the number of comparisons the algorithm makes.
Ans. Information-theoretic
71. We can implement Backtracking by constructing the ___.
Ans. State-space tree
72. Backtracking, in the ___ case, may have to generate all possible candidates in a state space that grows exponentially.
Ans. Worst
73. The n-Queens problem, the ___ circuit and the Subset-Sum problem are some examples of problems that can be solved by Backtracking.
Ans. Hamiltonian
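A minimal backtracking sketch for the n-Queens problem: the state-space tree is explored one row at a time, and a branch is abandoned (the last queen removed) as soon as a placement violates a constraint.

```python
def solve_n_queens(n):
    """Return one solution as a list of column positions, one per row, or None."""
    cols = []                                    # cols[r] = column of the queen placed in row r

    def safe(row, col):
        # No queen already placed shares this column or a diagonal with (row, col).
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols))

    def place(row):
        if row == n:                             # all queens placed: a complete solution
            return True
        for col in range(n):                     # children of this node of the state-space tree
            if safe(row, col):
                cols.append(col)
                if place(row + 1):
                    return True
                cols.pop()                       # backtrack: undo the move, try the next column
        return False

    return cols if place(0) else None

print(solve_n_queens(8))  # e.g. [0, 4, 7, 5, 2, 6, 1, 3]
```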
74. ___ organizes details of all candidate solutions and discards large subsets of fruitless candidate solutions.
Ans. Branch and Bound
75. A ___ is a solution that satisfies all the constraints of a problem.
Ans. Feasible solution