Merge branch 'develop' into filename-parser-no-exceptions
bhilbert4 authored Aug 16, 2024
2 parents 751cf6f + 025af4e commit f9c6885
Showing 24 changed files with 919 additions and 345 deletions.
41 changes: 41 additions & 0 deletions CHANGES.rst
@@ -1,5 +1,46 @@
## What's Changed

1.2.10 (2024-07-10)
===================

Duplicate of 1.2.9, caused by a versioning conflict with PyPI.


1.2.9 (2024-07-10)
==================

Web Application
~~~~~~~~~~~~~~~
- Add Download CSV button to query page by @BradleySappington in https://github.com/spacetelescope/jwql/pull/1561
- show file anomalies on exposure group page by @BradleySappington in https://github.com/spacetelescope/jwql/pull/1564
- create generic error page to handle exceptions in views. by @BradleySappington in https://github.com/spacetelescope/jwql/pull/1549

Project & API Documentation
~~~~~~~~~~~~~~~~~~~~~~~~~~~
- final model define for faking by @BradleySappington in https://github.com/spacetelescope/jwql/pull/1544
- Update Redis Package Names in Environment Files by @mfixstsci in https://github.com/spacetelescope/jwql/pull/1546
- [SCSB-145] require Python 3.10 by @zacharyburnett in https://github.com/spacetelescope/jwql/pull/1515
- debug false by default by @BradleySappington in https://github.com/spacetelescope/jwql/pull/1550
- Update NIRSpec TA Monitors to use Django DB Models by @mfixstsci in https://github.com/spacetelescope/jwql/pull/1499
- Update NIRSpec TA Models by @mfixstsci in https://github.com/spacetelescope/jwql/pull/1565
- Remove codecov.yml by @bhilbert4 in https://github.com/spacetelescope/jwql/pull/1588
- Remove filename parser test over filesystem by @bhilbert4 in https://github.com/spacetelescope/jwql/pull/1586
- Update remote to upstream in pull_jwql_branch.sh by @bhilbert4 in https://github.com/spacetelescope/jwql/pull/1591
- Add Dependencies for Servers in `pyproject.toml` by @mfixstsci in https://github.com/spacetelescope/jwql/pull/1568
- fix release upload step condition to match workflow trigger by @zacharyburnett in https://github.com/spacetelescope/jwql/pull/1593
- fix environment freeze workflow not picking up tag by @zacharyburnett in https://github.com/spacetelescope/jwql/pull/1594
- fix version matching pattern by @zacharyburnett in https://github.com/spacetelescope/jwql/pull/1595
- updating freeze matrix to include linux, mac and python 3.12 by @mfixstsci in https://github.com/spacetelescope/jwql/pull/1596
- Remove P750L from list of NIRSpec filters by @bhilbert4 in https://github.com/spacetelescope/jwql/pull/1598
- [build] fix `runs-on:` and update build filename for easier parsing by @zacharyburnett in https://github.com/spacetelescope/jwql/pull/1599
- upload to PyPI on release by @zacharyburnett in https://github.com/spacetelescope/jwql/pull/1601
- Updating jwst_reffiles version number by @mfixstsci in https://github.com/spacetelescope/jwql/pull/1606
- Remove old presentations from repo by @bhilbert4 in https://github.com/spacetelescope/jwql/pull/1607
- Num results fix by @BradleySappington in https://github.com/spacetelescope/jwql/pull/1608
- Add Environment Update Script by @mfixstsci in https://github.com/spacetelescope/jwql/pull/1609
- Add new NIRISS AMI-related suffixes by @bhilbert4 in https://github.com/spacetelescope/jwql/pull/1613


1.2.8 (2024-04-18)
==================

116 changes: 50 additions & 66 deletions jwql/instrument_monitors/common_monitors/dark_monitor.py
@@ -81,7 +81,7 @@
import os

from astropy.io import ascii, fits
-from astropy.modeling import models
+from astropy.modeling.models import Gaussian1D
from astropy.stats import sigma_clipped_stats
from astropy.time import Time
from bokeh.models import ColorBar, ColumnDataSource, HoverTool, Legend
@@ -92,22 +92,26 @@
from sqlalchemy import func
from sqlalchemy.sql.expression import and_

-from jwql.database.database_interface import session, engine
-from jwql.database.database_interface import NIRCamDarkQueryHistory, NIRCamDarkPixelStats, NIRCamDarkDarkCurrent
-from jwql.database.database_interface import NIRISSDarkQueryHistory, NIRISSDarkPixelStats, NIRISSDarkDarkCurrent
-from jwql.database.database_interface import MIRIDarkQueryHistory, MIRIDarkPixelStats, MIRIDarkDarkCurrent
-from jwql.database.database_interface import NIRSpecDarkQueryHistory, NIRSpecDarkPixelStats, NIRSpecDarkDarkCurrent
-from jwql.database.database_interface import FGSDarkQueryHistory, FGSDarkPixelStats, FGSDarkDarkCurrent
from jwql.instrument_monitors import pipeline_tools
from jwql.shared_tasks.shared_tasks import only_one, run_pipeline, run_parallel_pipeline
from jwql.utils import calculations, instrument_properties, mast_utils, monitor_utils
from jwql.utils.constants import ASIC_TEMPLATES, DARK_MONITOR_BETWEEN_EPOCH_THRESHOLD_TIME, DARK_MONITOR_MAX_BADPOINTS_TO_PLOT
from jwql.utils.constants import JWST_INSTRUMENT_NAMES, FULL_FRAME_APERTURES, JWST_INSTRUMENT_NAMES_MIXEDCASE
-from jwql.utils.constants import JWST_DATAPRODUCTS, MINIMUM_DARK_CURRENT_GROUPS, RAPID_READPATTERNS
+from jwql.utils.constants import JWST_DATAPRODUCTS, MINIMUM_DARK_CURRENT_GROUPS, ON_GITHUB_ACTIONS, ON_READTHEDOCS, RAPID_READPATTERNS
from jwql.utils.logging_functions import log_info, log_fail
from jwql.utils.permissions import set_permissions
from jwql.utils.utils import copy_files, ensure_dir_exists, get_config, filesystem_path, save_png

+if not ON_GITHUB_ACTIONS and not ON_READTHEDOCS:
+    # Need to set up django apps before we can access the models
+    import django  # noqa: E402 (module level import not at top of file)
+    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "jwql.website.jwql_proj.settings")
+    django.setup()
+
+# Import * is okay here because this module specifically only contains database models
+# for this monitor
+from jwql.website.apps.jwql.monitor_models.dark_current import *  # noqa: E402 (module level import not at top of file)

THRESHOLDS_FILE = os.path.join(os.path.split(__file__)[0], 'dark_monitor_file_thresholds.txt')
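The hunk above gates Django initialization behind the CI/docs flags and calls `os.environ.setdefault` before `django.setup()`, so a `DJANGO_SETTINGS_MODULE` already exported by the environment wins over the hard-coded default. A minimal stdlib sketch of that precedence behavior (the override module name is illustrative only):

```python
import os

# setdefault only writes the key when it is absent, so a value exported by
# the shell or the deployment environment takes precedence over the default.
os.environ.pop("DJANGO_SETTINGS_MODULE", None)
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "jwql.website.jwql_proj.settings")
default_used = os.environ["DJANGO_SETTINGS_MODULE"]

# Simulate an operator exporting a different settings module first.
os.environ["DJANGO_SETTINGS_MODULE"] = "my_site.settings_override"
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "jwql.website.jwql_proj.settings")
override_kept = os.environ["DJANGO_SETTINGS_MODULE"]
```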


@@ -230,9 +234,9 @@ def add_bad_pix(self, coordinates, pixel_type, files, mean_filename, baseline_fi
'obs_end_time': observation_end_time,
'mean_dark_image_file': os.path.basename(mean_filename),
'baseline_file': os.path.basename(baseline_filename),
-                   'entry_date': datetime.datetime.now()}
-        with engine.begin() as connection:
-            connection.execute(self.pixel_table.__table__.insert(), entry)
+                   'entry_date': datetime.datetime.now(datetime.timezone.utc)}
+        entry = self.pixel_table(**entry)
+        entry.save()
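`entry_date` switches here from a naive `datetime.datetime.now()` to an explicitly UTC-aware timestamp, which is what Django expects for `DateTimeField` when timezone support is enabled. A small sketch of the difference:

```python
import datetime

naive = datetime.datetime.now()                       # local wall clock, no tzinfo
aware = datetime.datetime.now(datetime.timezone.utc)  # carries an explicit UTC offset

# Only the aware timestamp keeps its offset through serialization and
# comparisons across servers in different time zones.
offset_seconds = aware.utcoffset().total_seconds()
```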

def create_mean_slope_figure(self, image, num_files, hotxy=None, deadxy=None, noisyxy=None, baseline_file=None,
min_time='', max_time=''):
@@ -412,14 +416,15 @@ def exclude_existing_badpix(self, badpix, pixel_type):
raise ValueError('Unrecognized bad pixel type: {}'.format(pixel_type))

logging.info("\t\tRunning database query")
-        db_entries = session.query(self.pixel_table) \
-            .filter(self.pixel_table.type == pixel_type) \
-            .filter(self.pixel_table.detector == self.detector) \
-            .all()
+        filters = {"type__iexact": pixel_type,
+                   "detector__iexact": self.detector
+                   }
+        records = self.pixel_table.objects.filter(**filters).all()

        already_found = []
-        if len(db_entries) != 0:
-            for _row in db_entries:
+        if records is not None:
+            for _row in records:
x_coords = _row.x_coord
y_coords = _row.y_coord
for x, y in zip(x_coords, y_coords):
@@ -442,7 +447,6 @@ def exclude_existing_badpix(self, badpix, pixel_type):

logging.info("\t\tKeeping {} {} pixels".format(len(new_pixels_x), pixel_type))

-        session.close()
return (new_pixels_x, new_pixels_y)
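The rewritten `exclude_existing_badpix` still walks the previously recorded coordinates and keeps only pixels the database has not seen. The core membership test, sketched with plain coordinate tuples (values are illustrative):

```python
# Coordinates already recorded for this detector / bad-pixel type
known = {(120, 45), (300, 7)}

# Candidate bad pixels found in the new mean dark image
candidates = [(120, 45), (55, 210), (300, 7), (9, 9)]

# Keep only pixels not already in the database
new_pixels = [(x, y) for (x, y) in candidates if (x, y) not in known]
```

Using a set for `known` keeps each lookup O(1), which matters when a detector has accumulated many bad-pixel records.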

def exclude_too_few_groups(self, result_list):
@@ -521,29 +525,15 @@ def get_baseline_filename(self):
filename : str
Name of fits file containing the baseline image
"""

-        subq = session.query(self.pixel_table.detector,
-                             func.max(self.pixel_table.entry_date).label('maxdate')
-                             ).group_by(self.pixel_table.detector).subquery('t2')
-
-        query = session.query(self.pixel_table).join(
-            subq,
-            and_(
-                self.pixel_table.detector == self.detector,
-                self.pixel_table.entry_date == subq.c.maxdate
-            )
-        )
-
-        count = query.count()
-        if not count:
-            filename = None
-        else:
-            filename = query.all()[0].baseline_file
+        record = self.pixel_table.objects.filter(detector__iexact=self.detector).order_by("-obs_end_time").first()
+        if record is not None:
+            filename = record.baseline_file
            # Specify the full path
            filename = os.path.join(get_config()['outputs'], 'dark_monitor', 'mean_slope_images', filename)
            logging.info('Baseline filename: {}'.format(filename))
+        else:
+            filename = None

-        session.close()
        return filename
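`get_baseline_filename` now asks the ORM for the newest matching row via `order_by("-obs_end_time").first()` instead of building a SQLAlchemy subquery. The same "latest matching record or None" logic in plain Python, with made-up rows:

```python
rows = [
    {"detector": "NRCA1", "obs_end_time": 59600.0, "baseline_file": "a.fits"},
    {"detector": "NRCA1", "obs_end_time": 59650.0, "baseline_file": "b.fits"},
    {"detector": "NRCB2", "obs_end_time": 59700.0, "baseline_file": "c.fits"},
]

def latest_baseline(rows, detector):
    """Return the baseline file of the newest row for a detector, or None."""
    matches = [r for r in rows if r["detector"] == detector]
    if not matches:
        return None
    return max(matches, key=lambda r: r["obs_end_time"])["baseline_file"]
```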

def identify_tables(self):
@@ -552,9 +542,9 @@ def identify_tables(self):
"""

mixed_case_name = JWST_INSTRUMENT_NAMES_MIXEDCASE[self.instrument]
-        self.query_table = eval('{}DarkQueryHistory'.format(mixed_case_name))
-        self.pixel_table = eval('{}DarkPixelStats'.format(mixed_case_name))
-        self.stats_table = eval('{}DarkDarkCurrent'.format(mixed_case_name))
+        self.query_table = eval(f'{mixed_case_name}DarkQueryHistory')
+        self.pixel_table = eval(f'{mixed_case_name}DarkPixelStats')
+        self.stats_table = eval(f'{mixed_case_name}DarkDarkCurrent')
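The f-string change keeps the `eval`-based lookup. With a fixed set of instrument names this is harmless, but the same class-by-name dispatch can be written without evaluating strings; a sketch with stand-in classes (not the jwql models):

```python
class NIRCamDarkQueryHistory:
    """Stand-in for the per-instrument model class."""

class MIRIDarkQueryHistory:
    """Stand-in for the per-instrument model class."""

# Explicit mapping from mixed-case instrument name to model class
QUERY_TABLES = {
    "NIRCam": NIRCamDarkQueryHistory,
    "MIRI": MIRIDarkQueryHistory,
}

def query_table_for(mixed_case_name):
    # KeyError on an unknown instrument, rather than eval'ing arbitrary text
    return QUERY_TABLES[mixed_case_name]
```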

def most_recent_search(self):
"""Query the query history database and return the information
@@ -567,23 +557,18 @@ def most_recent_search(self):
Date (in MJD) of the ending range of the previous MAST query
where the dark monitor was run.
"""
-        query = session.query(self.query_table).filter(self.query_table.aperture == self.aperture,
-                                                       self.query_table.readpattern == self.readpatt). \
-            filter(self.query_table.run_monitor == True)  # noqa: E348 (comparison to true)
-
-        dates = np.zeros(0)
-        for instance in query:
-            dates = np.append(dates, instance.end_time_mjd)
+        filters = {"aperture__iexact": self.aperture,
+                   "readpattern__iexact": self.readpatt,
+                   "run_monitor": True}
+        record = self.query_table.objects.filter(**filters).order_by("-end_time_mjd").first()

-        query_count = len(dates)
-        if query_count == 0:
+        if record is None:
            query_result = 59607.0  # a.k.a. Jan 28, 2022 == First JWST images (MIRI)
            logging.info(('\tNo query history for {} with {}. Beginning search date will be set to {}.'
                          .format(self.aperture, self.readpatt, query_result)))
        else:
-            query_result = np.max(dates)
+            query_result = record.end_time_mjd

-        session.close()
        return query_result
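The fallback `59607.0` is the MJD of the first JWST (MIRI) images on Jan 28, 2022. Modified Julian Date counts days since midnight on Nov 17, 1858, so the constant can be sanity-checked with stdlib arithmetic (a whole-day conversion, ignoring fractional days):

```python
import datetime

MJD_EPOCH = datetime.datetime(1858, 11, 17)

def to_mjd(dt):
    """Convert a naive datetime to Modified Julian Date (whole days here)."""
    return (dt - MJD_EPOCH).total_seconds() / 86400.0

first_miri_images = datetime.datetime(2022, 1, 28)
```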

def noise_check(self, new_noise_image, baseline_noise_image, threshold=1.5):
Expand Down Expand Up @@ -895,12 +880,12 @@ def process(self, file_list):
'double_gauss_width2': double_gauss_params[key][5],
'double_gauss_chisq': double_gauss_chisquared[key],
'mean_dark_image_file': os.path.basename(mean_slope_file),
-                             'hist_dark_values': bins[key],
-                             'hist_amplitudes': histogram[key],
-                             'entry_date': datetime.datetime.now()
+                             'hist_dark_values': list(bins[key]),
+                             'hist_amplitudes': list(histogram[key]),
+                             'entry_date': datetime.datetime.now(datetime.timezone.utc)
                             }
-            with engine.begin() as connection:
-                connection.execute(self.stats_table.__table__.insert(), dark_db_entry)
+            entry = self.stats_table(**dark_db_entry)
+            entry.save()
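The histogram arrays are wrapped in `list()` before being stored, since the model fields expect plain Python sequences rather than array objects. The stdlib `array` module shows the same conversion need (numpy arrays behave analogously via `tolist()`):

```python
import json
from array import array

hist_amplitudes = array("d", [3.0, 7.0, 12.0, 5.0])

# json (like many DB adapters) rejects the raw array object...
try:
    json.dumps(hist_amplitudes)
    serializable = True
except TypeError:
    serializable = False

# ...but a plain list round-trips cleanly.
payload = json.dumps(list(hist_amplitudes))
```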

def read_baseline_slope_image(self, filename):
"""Read in a baseline mean slope image and associated standard
@@ -951,7 +936,7 @@ def run(self):
self.query_end = Time.now().mjd

# Loop over all instruments
-        for instrument in ['miri', 'nircam']:  # JWST_INSTRUMENT_NAMES:
+        for instrument in JWST_INSTRUMENT_NAMES:
self.instrument = instrument
logging.info(f'\n\nWorking on {instrument}')

@@ -981,6 +966,7 @@

# Locate the record of the most recent MAST search
self.query_start = self.most_recent_search()

logging.info(f'\tQuery times: {self.query_start} {self.query_end}')

# Query MAST using the aperture and the time of the
@@ -1124,11 +1110,10 @@ def run(self):
'end_time_mjd': batch_end_time,
'files_found': len(dark_files),
'run_monitor': monitor_run,
-                                 'entry_date': datetime.datetime.now()}
+                                 'entry_date': datetime.datetime.now(datetime.timezone.utc)}

-                    with engine.begin() as connection:
-                        connection.execute(
-                            self.query_table.__table__.insert(), new_entry)
+                    entry = self.query_table(**new_entry)
+                    entry.save()
logging.info('\tUpdated the query history table')
logging.info('NEW ENTRY: ')
logging.info(new_entry)
@@ -1146,11 +1131,10 @@ def run(self):
'end_time_mjd': self.query_end,
'files_found': len(new_entries),
'run_monitor': monitor_run,
-                             'entry_date': datetime.datetime.now()}
+                             'entry_date': datetime.datetime.now(datetime.timezone.utc)}

-                with engine.begin() as connection:
-                    connection.execute(
-                        self.query_table.__table__.insert(), new_entry)
+                entry = self.query_table(**new_entry)
+                entry.save()
logging.info('\tUpdated the query history table')
logging.info('NEW ENTRY: ')
logging.info(new_entry)
@@ -1546,7 +1530,7 @@ def stats_by_amp(self, image, amps):
amplitude, peak, width = calculations.gaussian1d_fit(bin_centers, hist, initial_params)
gaussian_params[key] = [amplitude, peak, width]

-                gauss_fit_model = models.Gaussian1D(amplitude=amplitude[0], mean=peak[0], stddev=width[0])
+                gauss_fit_model = Gaussian1D(amplitude=amplitude[0], mean=peak[0], stddev=width[0])
gauss_fit = gauss_fit_model(bin_centers)

positive = hist > 0
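The import change calls `Gaussian1D` directly rather than through `astropy.modeling.models`; behavior is identical. For reference, the profile it evaluates is A·exp(−(x−μ)²/2σ²), easy to sanity-check without astropy (a hand-rolled stand-in, not the astropy implementation):

```python
import math

def gaussian1d(x, amplitude, mean, stddev):
    """Evaluate A * exp(-(x - mean)**2 / (2 * stddev**2)) at x."""
    return amplitude * math.exp(-((x - mean) ** 2) / (2.0 * stddev ** 2))

# The profile peaks at the mean, where the exponent is zero.
peak_value = gaussian1d(5.0, amplitude=2.0, mean=5.0, stddev=1.5)
```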
@@ -1304,7 +1304,7 @@ def plt_mags_time(self):
""",
)
self.date_range.js_on_change("value", callback)
-        mini_view = CDSView(filters=[self.date_filter])
+        mini_view = CDSView(filter=self.date_filter)

# create the bokeh plot
plot = figure(
@@ -1408,14 +1408,18 @@ def setup_date_range(self):
return indices;
""",
)
-        self.date_view = CDSView(filters=[self.date_filter])
+        self.date_view = CDSView(filter=self.date_filter)

-    def mk_plt_layout(self):
-        """Create the bokeh plot layout"""
-        self.query_results = pd.DataFrame(
-            list(NIRSpecMsataStats.objects.all().values())
-        )
-        self.source = ColumnDataSource(data=self.query_results)
+    def mk_plt_layout(self, plot_data):
+        """Create the bokeh plot layout
+
+        Parameters
+        ----------
+        plot_data : pandas.DataFrame
+            Pandas data frame of data to plot.
+        """
+
+        self.source = ColumnDataSource(data=plot_data)

# add a time array to the data source
self.add_time_column()
@@ -1815,8 +1819,14 @@ def run(self):
# Add MSATA data to stats table.
self.add_msata_data()

-        # Generate plot -- the database is queried in mk_plt_layout().
-        self.mk_plt_layout()
+        # Query results and convert into pandas df.
+        self.query_results = pd.DataFrame(
+            list(NIRSpecMsataStats.objects.all().values())
+        )
+
+        # Generate plot
+        self.mk_plt_layout(self.query_results)

logging.info(
"\tNew output plot file will be written as: {}".format(
self.output_file_name
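Both TA monitors now query the database in `run()` and hand the result to `mk_plt_layout(plot_data)`, so the plotting helper no longer hides a database dependency. A toy sketch of that inversion with stub classes (no bokeh or pandas; names are illustrative):

```python
class Monitor:
    def mk_plt_layout(self, plot_data):
        # Build plots purely from the data handed in; no queries here.
        self.source = plot_data

def run(monitor, query_fn):
    # The caller owns the query, which makes mk_plt_layout trivially
    # testable with canned data.
    data = query_fn()
    monitor.mk_plt_layout(data)

m = Monitor()
run(m, lambda: {"time_arr": [1.0, 2.0], "v2_offset": [0.1, 0.2]})
```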
@@ -797,13 +797,18 @@ def setup_date_range(self):
return indices;
""",
)
-        self.date_view = CDSView(filters=[filt])
+        self.date_view = CDSView(filter=filt)

-    def mk_plt_layout(self):
-        """Create the bokeh plot layout"""
-        self.query_results = pd.DataFrame(list(NIRSpecWataStats.objects.all().values()))
-        self.source = ColumnDataSource(data=self.query_results)
+    def mk_plt_layout(self, plot_data):
+        """Create the bokeh plot layout
+
+        Parameters
+        ----------
+        plot_data : pandas.DataFrame
+            Dataframe of data to plot in bokeh
+        """
+
+        self.source = ColumnDataSource(data=plot_data)

# add a time array to the data source
self.add_time_column()
@@ -1144,8 +1149,10 @@ def run(self):
# Add WATA data to stats table.
self.add_wata_data()

-        # Generate plot -- the database is queried in mk_plt_layout().
-        self.mk_plt_layout()
+        # Get Results from database table
+        self.query_results = pd.DataFrame(list(NIRSpecWataStats.objects.all().values()))
+        # Generate plot.
+        self.mk_plt_layout(self.query_results)
logging.info(
"\tNew output plot file will be written as: {}".format(
self.output_file_name
