How to create an Apache Spark 3.0 development cluster on a single machine using Docker
Apache Spark is one of the most widely used in-memory distributed processing frameworks for Big Data and advanced analytics. The main reasons for its success are its ease of use