Junior Python Interview Questions

Curated Python interview questions for developers targeting junior positions. 20 questions available.

Python Interview Questions & Answers

Welcome to our comprehensive collection of Python interview questions and answers. This page contains expertly curated interview questions covering all aspects of Python, from fundamental concepts to advanced topics. Whether you're preparing for an entry-level position or a senior role, you'll find questions tailored to your experience level.

Our Python interview questions are designed to help you:

  • Understand core concepts and best practices in Python
  • Prepare for technical interviews at all experience levels
  • Master both theoretical knowledge and practical application
  • Build confidence for your next Python interview

Each question includes detailed answers and explanations to help you understand not just what the answer is, but why it's correct. We cover topics ranging from basic Python concepts to advanced scenarios that you might encounter in senior-level interviews.

Use the filters below to find questions by difficulty level (Entry, Junior, Mid, Senior, Expert) or focus specifically on code challenges. Each question is carefully crafted to reflect real-world interview scenarios you'll encounter at top tech companies, startups, and MNCs.

Questions

20 questions
Q1:

Explain Python's multiple inheritance and MRO.

Junior

Answer

Python supports multiple inheritance.
MRO (Method Resolution Order) decides method lookup order.
Uses C3 linearization to avoid ambiguity.
Call C.mro() to check the resolution order.
Quick Summary: Python supports multiple inheritance: class C(A, B). MRO (Method Resolution Order) determines which method to call using C3 linearization algorithm. Check with ClassName.__mro__ or ClassName.mro(). super() follows MRO correctly - critical for cooperative multiple inheritance. Diamond problem is resolved by MRO - each class appears once in the resolution order.
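A minimal sketch of the diamond problem (the class names here are illustrative):

```python
# A diamond hierarchy: D inherits from B and C, which both inherit from A.
class A:
    def greet(self):
        return "A"

class B(A):
    def greet(self):
        return "B"

class C(A):
    def greet(self):
        return "C"

class D(B, C):
    pass

# C3 linearization puts each class in the MRO exactly once.
print([cls.__name__ for cls in D.mro()])  # ['D', 'B', 'C', 'A', 'object']
print(D().greet())  # 'B' — the first match along the MRO wins
```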
Q2:

What are Python magic or dunder methods?

Junior

Answer

Special methods like __init__, __str__, __repr__, __eq__, __add__.
Used for operator overloading and customizing built-in behavior.
Example: obj1 + obj2 calls obj1.__add__(obj2).
Quick Summary: Dunder (double underscore) methods are special methods that Python calls implicitly. __init__ (construction), __str__ (string representation for print), __repr__ (detailed representation), __len__ (len()), __getitem__ (indexing []), __iter__ (iteration), __eq__ (==), __lt__ (<), __add__ (+), __enter__/__exit__ (with statement), __call__ (callable objects). Implementing them makes your classes work naturally with Python built-ins.
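For illustration, a small class (the Vector name is made up for this example) implementing a few dunder methods:

```python
class Vector:
    """A 2D vector that plays nicely with ==, +, and repr()."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __repr__(self):
        return f"Vector({self.x}, {self.y})"

    def __eq__(self, other):
        return (self.x, self.y) == (other.x, other.y)

    def __add__(self, other):
        # v1 + v2 is translated by Python into v1.__add__(v2)
        return Vector(self.x + other.x, self.y + other.y)

print(Vector(1, 2) + Vector(3, 4))  # Vector(4, 6)
```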
Q3:

What are Python metaclasses?

Junior

Answer

Metaclass is the class of a class.
Defines how classes are created.
Useful for validation, auto-registration, or enforcing patterns.
Custom metaclasses extend type.
Quick Summary: Metaclass is the class of a class - controls how classes are created. Default metaclass is type. Custom metaclass: class MyMeta(type): override __new__ or __init__. Use class MyClass(metaclass=MyMeta). Use cases: ORM field registration (Django models), API validation, singleton enforcement, automatic method registration. Metaclasses are powerful but complex - often a class decorator or __init_subclass__ is simpler.
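A sketch of the auto-registration use case mentioned above (RegisterMeta and the plugin classes are illustrative names):

```python
registry = {}

class RegisterMeta(type):
    """Metaclass that records every subclass in a registry as it is defined."""
    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if bases:  # skip the base class itself
            registry[name] = cls
        return cls

class Plugin(metaclass=RegisterMeta):
    pass

class CsvPlugin(Plugin):
    pass

class JsonPlugin(Plugin):
    pass

# Both subclasses registered themselves at class-creation time.
print(sorted(registry))  # ['CsvPlugin', 'JsonPlugin']
```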
Q4:

Difference between iterators and iterables.

Junior

Answer

Iterable: implements __iter__().
Iterator: implements __next__().
for-loops use iter() and next() internally.
Quick Summary: Iterable: object you can loop over - has __iter__() returning an iterator. Lists, tuples, strings, dicts are iterables. Iterator: object with __next__() - stateful, remembers position, consumed once. Iterables can create multiple iterators. iter(mylist) creates an iterator from a list. Generators are iterators. For-loop calls iter() then repeatedly calls next() until StopIteration.
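A short demonstration of the distinction:

```python
nums = [1, 2, 3]   # an iterable: has __iter__()
it = iter(nums)    # an iterator: has __next__(), stateful, consumed once

print(next(it))  # 1
print(next(it))  # 2

# The iterable itself is not consumed — a fresh iterator starts over.
print(list(iter(nums)))  # [1, 2, 3]

# The first iterator remembers its position and yields only what is left.
rest = list(it)
print(rest)  # [3]
```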
Q5:

How do generators improve performance?

Junior

Answer

Generators use yield for lazy evaluation.
Memory-efficient for large datasets.
Useful for pipelines and streaming data.
Quick Summary: Generators use lazy evaluation - values produced one at a time on demand, not all at once. Memory efficient: a generator for 1 million numbers uses constant memory vs a list using millions of bytes. Pipelines: chain generators to process data step-by-step (like Unix pipes). yield pauses execution and returns a value; next() resumes from where it left off. Use for large files, infinite sequences, data pipelines.
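A small pipeline sketch (the function names are illustrative):

```python
def read_numbers(lines):
    # Lazily parse strings into ints; nothing runs until iterated.
    for line in lines:
        yield int(line)

def squares(nums):
    for n in nums:
        yield n * n

# Chained generators process one value at a time, like a Unix pipe.
data = ["1", "2", "3", "4"]
pipeline = squares(read_numbers(data))
print(list(pipeline))  # [1, 4, 9, 16]
```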
Q6:

Explain Python decorators in depth.

Junior

Answer

Decorators modify behavior of functions or classes.
Can be parameterized.
Used for logging, caching, auth, timing.
Use @decorator syntax.
Quick Summary: A decorator is a function that takes a function and returns a wrapper: my_decorator(func) defines an inner wrapper(*args, **kwargs) that runs code before and after calling func(*args, **kwargs), then returns that wrapper. Apply with @my_decorator. Decorators with arguments need an extra level of nesting. Class decorators work on classes. functools.wraps preserves metadata. Built-in decorators: @property, @staticmethod, @classmethod, @functools.lru_cache.
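For illustration, a decorator that counts calls (log_calls is a made-up name) and uses functools.wraps to preserve metadata:

```python
import functools

def log_calls(func):
    """Decorator that counts calls to the wrapped function."""
    @functools.wraps(func)          # keeps func's __name__, __doc__, etc.
    def wrapper(*args, **kwargs):
        wrapper.calls += 1          # state stored on the wrapper itself
        return func(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

@log_calls
def add(a, b):
    """Add two numbers."""
    return a + b

print(add(2, 3))     # 5
print(add.calls)     # 1
print(add.__name__)  # 'add' — preserved thanks to functools.wraps
```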
Q7:

How does Python handle closures?

Junior

Answer

Closures capture variables from enclosing scopes.
Used in decorators, factories, callbacks.
inner() retains access to outer() variables even after outer() finishes.
Quick Summary: A closure is a nested function that captures and remembers variables from its enclosing scope even after the outer function returns. Each captured variable is stored in a cell object. Closures are the mechanism behind decorators and factory functions. Variables must be declared nonlocal to be reassigned inside a closure. Closures in loops: use a default argument (i=i) to capture the current value.
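A sketch of both points above (the names are illustrative):

```python
def make_counter():
    count = 0
    def increment():
        nonlocal count  # required to reassign the enclosing variable
        count += 1
        return count
    return increment

counter = make_counter()
counter(); counter()
print(counter())  # 3 — state survives between calls

# The loop pitfall: without i=i, every lambda would see the final i.
funcs = [lambda i=i: i * 10 for i in range(3)]
print([f() for f in funcs])  # [0, 10, 20]
```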
Q8:

Explain with statement and context managers.

Junior

Answer

Context managers use __enter__ and __exit__.
Automatically handle resource cleanup.
Used for files, network connections, locks.
Can create custom managers using contextlib.
Quick Summary: The with statement ensures resources are properly acquired and released. A context manager implements __enter__ (called when entering with block - can return a value to as clause) and __exit__ (called on exit - even if exception - handles cleanup). Common examples: open() for files, threading.Lock(), database connections. Create custom context managers with contextlib.contextmanager decorator.
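A minimal custom context manager using contextlib (the resource name is illustrative; an events list stands in for real setup/teardown):

```python
from contextlib import contextmanager

events = []

@contextmanager
def managed_resource(name):
    # Code before yield plays the role of __enter__; the finally block
    # plays the role of __exit__ and runs even if the with-block raises.
    events.append(f"open {name}")
    try:
        yield name  # value bound by the `as` clause
    finally:
        events.append(f"close {name}")

with managed_resource("db") as r:
    events.append(f"use {r}")

print(events)  # ['open db', 'use db', 'close db']
```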
Q9:

How do you perform logging in Python?

Junior

Answer

Use logging module.
Levels: DEBUG, INFO, WARNING, ERROR, CRITICAL.
Supports logging to console, files, remote servers.
Preferred over print() in production.
Quick Summary: Python logging: import logging, use logging.getLogger(__name__) to get a logger. Log levels: DEBUG, INFO, WARNING, ERROR, CRITICAL. Configure with basicConfig or a config dict. Add handlers (StreamHandler for console, FileHandler for file, RotatingFileHandler for log rotation). Use formatters to include timestamp, level, module. Prefer logging over print() in any code beyond simple scripts.
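A small configuration sketch; the in-memory stream is used here only so the output is easy to inspect (in real code you would attach a console or file handler):

```python
import io
import logging

logger = logging.getLogger("demo")
logger.setLevel(logging.DEBUG)

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s:%(name)s:%(message)s"))
logger.addHandler(handler)

logger.debug("connecting to %s", "db")  # lazy %-style formatting
logger.warning("retrying")

print(stream.getvalue())
# DEBUG:demo:connecting to db
# WARNING:demo:retrying
```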
Q10:

How do you handle exceptions and create custom exceptions?

Junior

Answer

Use try/except/finally.
Create custom exceptions by subclassing Exception.
raise MyError("msg").
Ensures clean error handling.
Quick Summary: Handle exceptions: try/except/else/finally blocks. Catch specific exceptions (except ValueError as e). Custom exceptions: class ValidationError(Exception): pass - inherit from Exception or a more specific base. Add custom attributes in __init__. Re-raise with raise (no args) inside except. Exception chaining: raise NewError() from original_error. Always clean up in finally.
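A sketch of a custom exception with an extra attribute (ValidationError and set_age are illustrative names):

```python
class ValidationError(Exception):
    """Custom exception carrying the offending field name."""
    def __init__(self, field, message):
        super().__init__(f"{field}: {message}")
        self.field = field

def set_age(age):
    if age < 0:
        raise ValidationError("age", "must be non-negative")
    return age

try:
    set_age(-1)
except ValidationError as e:
    print(e.field)  # 'age'
    print(e)        # 'age: must be non-negative'
```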
Q11:

How do you perform unit testing in Python?

Junior

Answer

Use unittest or pytest frameworks.
Mock dependencies using unittest.mock.
Test functions, classes, and APIs.
Quick Summary: Python unit testing: use unittest (built-in) or pytest (more popular). unittest: subclass TestCase, write test_* methods, use self.assertEqual/assertRaises/etc. pytest: just write functions starting with test_, use assert statements (pytest rewrites them for detailed failures). Fixtures, parametrize, and mocking with unittest.mock. Run with pytest or python -m unittest. Aim for 80%+ coverage.
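A pytest-style sketch (slugify is a made-up function under test; pytest discovers test_* functions automatically, but they are also plain callables):

```python
def slugify(title):
    """Function under test: normalize a title into a URL slug."""
    return title.strip().lower().replace(" ", "-")

def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"

def test_slugify_strips_whitespace():
    assert slugify("  Hi There  ") == "hi-there"

# `pytest` would run these for you; calling them directly also works.
test_slugify_basic()
test_slugify_strips_whitespace()
print("all tests passed")
```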
Q12:

What are advanced uses of *args and **kwargs?

Junior

Answer

*args passes variable positional args.
**kwargs passes variable keyword args.
Used to forward arguments to other functions.
Common in decorators and wrappers.
Quick Summary: Advanced *args/**kwargs: forwarding arguments to another function (func(*args, **kwargs)), combining with keyword-only args (def f(a, b, *, key_only)), unpacking in function calls (f(*list, **dict)). Keyword-only args (after *): def f(a, *, b) forces b to be passed by keyword. Positional-only args (before /): def f(a, b, /) forces a, b to be positional. Both together for fine-grained API control.
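A sketch combining forwarding with a keyword-only parameter (connect and traced are illustrative names):

```python
def connect(host, port, *, timeout=5):
    # The bare * makes timeout keyword-only.
    return f"{host}:{port} (timeout={timeout})"

def traced(func):
    # The classic forwarding pattern: pass everything through untouched.
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__} args={args} kwargs={kwargs}")
        return func(*args, **kwargs)
    return wrapper

traced_connect = traced(connect)
params = {"timeout": 10}
print(traced_connect("db.local", 5432, **params))
# db.local:5432 (timeout=10)
```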
Q13:

Explain Python's itertools module.

Junior

Answer

Provides fast, memory-efficient iterators.
Functions: count(), cycle(), combinations(), product().
Useful for looping, combinatorics, pipelines.
Quick Summary: itertools provides efficient iterator building blocks. chain() combines multiple iterables. cycle() repeats infinitely. islice() slices iterators. groupby() groups consecutive elements. product() computes Cartesian product. permutations() and combinations(). count() infinite counter. dropwhile()/takewhile() conditional iterators. All are lazy (memory efficient). Used heavily in data processing pipelines.
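A few of these building blocks in action:

```python
from itertools import chain, count, groupby, islice

# chain: concatenate iterables lazily
print(list(chain([1, 2], (3, 4))))  # [1, 2, 3, 4]

# islice: take a slice of an infinite iterator safely
print(list(islice(count(10), 3)))   # [10, 11, 12]

# groupby groups *consecutive* equal keys — sort by the key first
words = sorted(["ant", "bee", "bat", "cow"], key=lambda w: w[0])
groups = {k: list(g) for k, g in groupby(words, key=lambda w: w[0])}
print(groups)  # {'a': ['ant'], 'b': ['bee', 'bat'], 'c': ['cow']}
```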
Q14:

Explain functools and lru_cache.

Junior

Answer

functools offers higher-order tools.
lru_cache caches results of expensive functions.
Improves performance for recursive or repeated calculations.
Quick Summary: The functools module provides tools for higher-order functions. lru_cache (Least Recently Used) memoizes a function - it caches results keyed by arguments. Use @lru_cache(maxsize=128) or @cache (Python 3.9+, unbounded). Perfect for recursive algorithms (fibonacci) or expensive computations called repeatedly with the same inputs. partial() creates a new function with pre-filled arguments. reduce() applies a function cumulatively.
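The classic memoized-fibonacci example:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Without caching this recursion is O(2^n); with it, each n is
    # computed once and later calls are cache hits.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))           # 832040
print(fib.cache_info())  # hit/miss statistics for the cache
```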
Q15:

How do you handle file operations with context managers?

Junior

Answer

Use with open("file") for auto-close.
Supports read(), write(), binary modes.
Prevents file descriptor leaks.
Quick Summary: Context managers for file operations: with open("file.txt", "r") as f: data = f.read(). File closes automatically even if an exception occurs. Process files line-by-line to save memory: for line in f: process(line). Use pathlib.Path for modern file path manipulation: Path("dir") / "file.txt". Read CSV with csv module, JSON with json module. Write binary with "wb" mode.
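A write-then-read sketch; a temporary directory is used here so the example cleans up after itself:

```python
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "notes.txt"

    # `with` closes the handle even if an exception occurs.
    with open(path, "w", encoding="utf-8") as f:
        f.write("line one\nline two\n")

    with open(path, encoding="utf-8") as f:
        # Iterating line-by-line keeps memory usage low for big files.
        lines = [line.rstrip("\n") for line in f]

print(lines)  # ['line one', 'line two']
```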
Q16:

How do you connect Python with databases?

Junior

Answer

Use sqlite3, psycopg2, PyMySQL, or ORMs like SQLAlchemy.
Use parameterized queries to prevent SQL injection.
Quick Summary: Python database access: SQLite built-in (sqlite3 module). PostgreSQL/MySQL via psycopg2 or PyMySQL. ORMs: SQLAlchemy (most powerful, any DB) or Django ORM (built into Django). Use parameterized queries to prevent SQL injection (cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))). Always use context managers or try/finally to close connections.
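A minimal sqlite3 sketch; the same DB-API pattern (cursor, parameterized execute) carries over to psycopg2 and PyMySQL:

```python
import sqlite3

# In-memory database for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

user_id = 1
# Parameterized query — never build SQL with string formatting.
row = conn.execute(
    "SELECT name FROM users WHERE id = ?", (user_id,)
).fetchone()
print(row)  # ('alice',)
conn.close()
```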
Q17:

How do you implement serialization and deserialization?

Junior

Answer

Use json.dumps()/json.loads().
pickle for Python object serialization.
Avoid untrusted pickle data for security.
Quick Summary: Serialization: convert Python objects to a format for storage or transmission. json module for JSON (human-readable, web-friendly). pickle module for Python-specific binary serialization (any Python object). marshal for Python bytecode. Third-party: marshmallow (validation + serialization), dataclasses with asdict(). Caution: never unpickle data from untrusted sources (security risk).
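A round-trip sketch with both modules:

```python
import json
import pickle

user = {"name": "alice", "tags": ["admin", "dev"], "active": True}

# JSON: human-readable, web-friendly, limited to basic types.
payload = json.dumps(user)
restored = json.loads(payload)
print(restored == user)  # True — round-trip preserved

# pickle: handles arbitrary Python objects, but never unpickle
# data from untrusted sources.
blob = pickle.dumps(user)
print(pickle.loads(blob) == user)  # True
```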
Q18:

Explain Python's datetime and time modules.

Junior

Answer

datetime provides date/time objects and timedelta.
time handles timestamps and sleep.
Supports formatting and parsing with strptime().
Quick Summary: datetime module: datetime.datetime for date+time, date for date-only, time for time-only, timedelta for durations. strftime(format) converts datetime to string. strptime(string, format) parses string to datetime. Use timezone-aware datetimes: datetime.now(timezone.utc). Python 3.9+ has zoneinfo for IANA timezone support. dateutil library simplifies parsing and timezone handling.
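A quick tour of parsing, formatting, and arithmetic:

```python
from datetime import datetime, timedelta, timezone

# strptime: string -> datetime; strftime: datetime -> string.
dt = datetime.strptime("2024-03-01 09:30", "%Y-%m-%d %H:%M")
print(dt.strftime("%d %b %Y"))      # e.g. '01 Mar 2024' (locale-dependent)

# timedelta arithmetic
deadline = dt + timedelta(days=10)
print(deadline.date().isoformat())  # '2024-03-11'

# Prefer timezone-aware datetimes in real code.
now_utc = datetime.now(timezone.utc)
print(now_utc.tzinfo)               # UTC
```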
Q19:

How do you implement caching in Python?

Junior

Answer

Use lru_cache or dictionaries for in-memory caching.
Use Redis/Memcached for distributed caching.
Improves performance of repeated calls.
Quick Summary: Caching in Python: functools.lru_cache for in-memory function result caching. dict for simple key-value caching. cachetools library for LRU, TTL, and other cache types. Redis (via redis-py) for distributed caching across processes. Django/Flask caching backends (memcached, Redis). Cache invalidation strategy: TTL-based expiry, event-based invalidation, or versioned cache keys.
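A minimal TTL-based cache sketch (TTLCache is a made-up class for illustration, not the cachetools one; it is not thread-safe):

```python
import time

class TTLCache:
    """Dict-backed cache whose entries expire after a fixed TTL."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.monotonic() > expires:
            del self._store[key]  # expired: drop it and report a miss
            return None
        return value

cache = TTLCache(ttl_seconds=60)
cache.set("user:1", {"name": "alice"})
print(cache.get("user:1"))  # {'name': 'alice'}
print(cache.get("user:2"))  # None — cache miss
```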
Q20:

How do you profile and optimize Python code?

Junior

Answer

Use cProfile or timeit.
Use line profiling for detailed analysis.
Optimize loops, data structures.
Use NumPy or Cython for heavy computation.
Quick Summary: Python profiling tools: cProfile (built-in, function-level timing - python -m cProfile script.py). profile module (pure Python, slower). line_profiler (@profile decorator, line-by-line timing). memory_profiler (memory usage per line). py-spy (sampling profiler, low overhead, production-safe). timeit for benchmarking small code snippets. Always profile before optimizing - don't guess bottlenecks.
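A small timeit benchmark comparing two equivalent loops (absolute times vary by machine, so measure rather than assume):

```python
import timeit

# Time 200 runs of each approach to building a list of squares.
loop_time = timeit.timeit(
    "result = []\nfor i in range(1000):\n    result.append(i * i)",
    number=200,
)
comp_time = timeit.timeit("[i * i for i in range(1000)]", number=200)

print(f"loop:          {loop_time:.4f}s")
print(f"comprehension: {comp_time:.4f}s")
# The comprehension is typically faster — but always profile, don't guess.
```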
