You can access Spark's log4j logger through the SparkContext's JVM gateway (`sc._jvm`):

log4jLogger = sc._jvm.org.apache.log4j  # py4j handle to the JVM-side log4j package
LOGGER = log4jLogger.LogManager.getLogger(__name__)  # named logger, as in plain log4j
LOGGER.info("pyspark script logger initialized")

Because this logger lives on the JVM side, its output goes to the same log4j destination as Spark's own driver logs, rather than to Python's standard logging handlers.